Dec 11 04:00:43 np0005555140 kernel: Linux version 5.14.0-648.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Dec 5 11:18:23 UTC 2025
Dec 11 04:00:43 np0005555140 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 11 04:00:43 np0005555140 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 11 04:00:43 np0005555140 kernel: BIOS-provided physical RAM map:
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 11 04:00:43 np0005555140 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 11 04:00:43 np0005555140 kernel: NX (Execute Disable) protection: active
Dec 11 04:00:43 np0005555140 kernel: APIC: Static calls initialized
Dec 11 04:00:43 np0005555140 kernel: SMBIOS 2.8 present.
Dec 11 04:00:43 np0005555140 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 11 04:00:43 np0005555140 kernel: Hypervisor detected: KVM
Dec 11 04:00:43 np0005555140 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 11 04:00:43 np0005555140 kernel: kvm-clock: using sched offset of 3156151571 cycles
Dec 11 04:00:43 np0005555140 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 11 04:00:43 np0005555140 kernel: tsc: Detected 2800.000 MHz processor
Dec 11 04:00:43 np0005555140 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 11 04:00:43 np0005555140 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 11 04:00:43 np0005555140 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 11 04:00:43 np0005555140 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 11 04:00:43 np0005555140 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 11 04:00:43 np0005555140 kernel: Using GB pages for direct mapping
Dec 11 04:00:43 np0005555140 kernel: RAMDISK: [mem 0x2d46a000-0x32a2cfff]
Dec 11 04:00:43 np0005555140 kernel: ACPI: Early table checksum verification disabled
Dec 11 04:00:43 np0005555140 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 11 04:00:43 np0005555140 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 04:00:43 np0005555140 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 04:00:43 np0005555140 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 04:00:43 np0005555140 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 11 04:00:43 np0005555140 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 04:00:43 np0005555140 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 04:00:43 np0005555140 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 11 04:00:43 np0005555140 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 11 04:00:43 np0005555140 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 11 04:00:43 np0005555140 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 11 04:00:43 np0005555140 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 11 04:00:43 np0005555140 kernel: No NUMA configuration found
Dec 11 04:00:43 np0005555140 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 11 04:00:43 np0005555140 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec 11 04:00:43 np0005555140 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 11 04:00:43 np0005555140 kernel: Zone ranges:
Dec 11 04:00:43 np0005555140 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 11 04:00:43 np0005555140 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 11 04:00:43 np0005555140 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 11 04:00:43 np0005555140 kernel:  Device   empty
Dec 11 04:00:43 np0005555140 kernel: Movable zone start for each node
Dec 11 04:00:43 np0005555140 kernel: Early memory node ranges
Dec 11 04:00:43 np0005555140 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 11 04:00:43 np0005555140 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 11 04:00:43 np0005555140 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 11 04:00:43 np0005555140 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 11 04:00:43 np0005555140 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 11 04:00:43 np0005555140 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 11 04:00:43 np0005555140 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 11 04:00:43 np0005555140 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 11 04:00:43 np0005555140 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 11 04:00:43 np0005555140 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 11 04:00:43 np0005555140 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 11 04:00:43 np0005555140 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 11 04:00:43 np0005555140 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 11 04:00:43 np0005555140 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 11 04:00:43 np0005555140 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 11 04:00:43 np0005555140 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 11 04:00:43 np0005555140 kernel: TSC deadline timer available
Dec 11 04:00:43 np0005555140 kernel: CPU topo: Max. logical packages:   8
Dec 11 04:00:43 np0005555140 kernel: CPU topo: Max. logical dies:       8
Dec 11 04:00:43 np0005555140 kernel: CPU topo: Max. dies per package:   1
Dec 11 04:00:43 np0005555140 kernel: CPU topo: Max. threads per core:   1
Dec 11 04:00:43 np0005555140 kernel: CPU topo: Num. cores per package:     1
Dec 11 04:00:43 np0005555140 kernel: CPU topo: Num. threads per package:   1
Dec 11 04:00:43 np0005555140 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 11 04:00:43 np0005555140 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 11 04:00:43 np0005555140 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 11 04:00:43 np0005555140 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 11 04:00:43 np0005555140 kernel: Booting paravirtualized kernel on KVM
Dec 11 04:00:43 np0005555140 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 11 04:00:43 np0005555140 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 11 04:00:43 np0005555140 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 11 04:00:43 np0005555140 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 11 04:00:43 np0005555140 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 11 04:00:43 np0005555140 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64", will be passed to user space.
Dec 11 04:00:43 np0005555140 kernel: random: crng init done
Dec 11 04:00:43 np0005555140 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: Fallback order for Node 0: 0 
Dec 11 04:00:43 np0005555140 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 11 04:00:43 np0005555140 kernel: Policy zone: Normal
Dec 11 04:00:43 np0005555140 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 11 04:00:43 np0005555140 kernel: software IO TLB: area num 8.
Dec 11 04:00:43 np0005555140 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 11 04:00:43 np0005555140 kernel: ftrace: allocating 49357 entries in 193 pages
Dec 11 04:00:43 np0005555140 kernel: ftrace: allocated 193 pages with 3 groups
Dec 11 04:00:43 np0005555140 kernel: Dynamic Preempt: voluntary
Dec 11 04:00:43 np0005555140 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 11 04:00:43 np0005555140 kernel: rcu: 	RCU event tracing is enabled.
Dec 11 04:00:43 np0005555140 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 11 04:00:43 np0005555140 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec 11 04:00:43 np0005555140 kernel: 	Rude variant of Tasks RCU enabled.
Dec 11 04:00:43 np0005555140 kernel: 	Tracing variant of Tasks RCU enabled.
Dec 11 04:00:43 np0005555140 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 11 04:00:43 np0005555140 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 11 04:00:43 np0005555140 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 11 04:00:43 np0005555140 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 11 04:00:43 np0005555140 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 11 04:00:43 np0005555140 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 11 04:00:43 np0005555140 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 11 04:00:43 np0005555140 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 11 04:00:43 np0005555140 kernel: Console: colour VGA+ 80x25
Dec 11 04:00:43 np0005555140 kernel: printk: console [ttyS0] enabled
Dec 11 04:00:43 np0005555140 kernel: ACPI: Core revision 20230331
Dec 11 04:00:43 np0005555140 kernel: APIC: Switch to symmetric I/O mode setup
Dec 11 04:00:43 np0005555140 kernel: x2apic enabled
Dec 11 04:00:43 np0005555140 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 11 04:00:43 np0005555140 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 11 04:00:43 np0005555140 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec 11 04:00:43 np0005555140 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 11 04:00:43 np0005555140 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 11 04:00:43 np0005555140 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 11 04:00:43 np0005555140 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 11 04:00:43 np0005555140 kernel: Spectre V2 : Mitigation: Retpolines
Dec 11 04:00:43 np0005555140 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 11 04:00:43 np0005555140 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 11 04:00:43 np0005555140 kernel: RETBleed: Mitigation: untrained return thunk
Dec 11 04:00:43 np0005555140 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 11 04:00:43 np0005555140 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 11 04:00:43 np0005555140 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 11 04:00:43 np0005555140 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 11 04:00:43 np0005555140 kernel: x86/bugs: return thunk changed
Dec 11 04:00:43 np0005555140 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 11 04:00:43 np0005555140 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 11 04:00:43 np0005555140 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 11 04:00:43 np0005555140 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 11 04:00:43 np0005555140 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 11 04:00:43 np0005555140 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 11 04:00:43 np0005555140 kernel: Freeing SMP alternatives memory: 40K
Dec 11 04:00:43 np0005555140 kernel: pid_max: default: 32768 minimum: 301
Dec 11 04:00:43 np0005555140 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 11 04:00:43 np0005555140 kernel: landlock: Up and running.
Dec 11 04:00:43 np0005555140 kernel: Yama: becoming mindful.
Dec 11 04:00:43 np0005555140 kernel: SELinux:  Initializing.
Dec 11 04:00:43 np0005555140 kernel: LSM support for eBPF active
Dec 11 04:00:43 np0005555140 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 11 04:00:43 np0005555140 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 11 04:00:43 np0005555140 kernel: ... version:                0
Dec 11 04:00:43 np0005555140 kernel: ... bit width:              48
Dec 11 04:00:43 np0005555140 kernel: ... generic registers:      6
Dec 11 04:00:43 np0005555140 kernel: ... value mask:             0000ffffffffffff
Dec 11 04:00:43 np0005555140 kernel: ... max period:             00007fffffffffff
Dec 11 04:00:43 np0005555140 kernel: ... fixed-purpose events:   0
Dec 11 04:00:43 np0005555140 kernel: ... event mask:             000000000000003f
Dec 11 04:00:43 np0005555140 kernel: signal: max sigframe size: 1776
Dec 11 04:00:43 np0005555140 kernel: rcu: Hierarchical SRCU implementation.
Dec 11 04:00:43 np0005555140 kernel: rcu: 	Max phase no-delay instances is 400.
Dec 11 04:00:43 np0005555140 kernel: smp: Bringing up secondary CPUs ...
Dec 11 04:00:43 np0005555140 kernel: smpboot: x86: Booting SMP configuration:
Dec 11 04:00:43 np0005555140 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 11 04:00:43 np0005555140 kernel: smp: Brought up 1 node, 8 CPUs
Dec 11 04:00:43 np0005555140 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec 11 04:00:43 np0005555140 kernel: node 0 deferred pages initialised in 8ms
Dec 11 04:00:43 np0005555140 kernel: Memory: 7763948K/8388068K available (16384K kernel code, 5795K rwdata, 13916K rodata, 4192K init, 7164K bss, 618220K reserved, 0K cma-reserved)
Dec 11 04:00:43 np0005555140 kernel: devtmpfs: initialized
Dec 11 04:00:43 np0005555140 kernel: x86/mm: Memory block size: 128MB
Dec 11 04:00:43 np0005555140 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 11 04:00:43 np0005555140 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 11 04:00:43 np0005555140 kernel: pinctrl core: initialized pinctrl subsystem
Dec 11 04:00:43 np0005555140 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 11 04:00:43 np0005555140 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 11 04:00:43 np0005555140 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 11 04:00:43 np0005555140 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 11 04:00:43 np0005555140 kernel: audit: initializing netlink subsys (disabled)
Dec 11 04:00:43 np0005555140 kernel: audit: type=2000 audit(1765443641.606:1): state=initialized audit_enabled=0 res=1
Dec 11 04:00:43 np0005555140 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 11 04:00:43 np0005555140 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 11 04:00:43 np0005555140 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 11 04:00:43 np0005555140 kernel: cpuidle: using governor menu
Dec 11 04:00:43 np0005555140 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 11 04:00:43 np0005555140 kernel: PCI: Using configuration type 1 for base access
Dec 11 04:00:43 np0005555140 kernel: PCI: Using configuration type 1 for extended access
Dec 11 04:00:43 np0005555140 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 11 04:00:43 np0005555140 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 11 04:00:43 np0005555140 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 11 04:00:43 np0005555140 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 11 04:00:43 np0005555140 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 11 04:00:43 np0005555140 kernel: Demotion targets for Node 0: null
Dec 11 04:00:43 np0005555140 kernel: cryptd: max_cpu_qlen set to 1000
Dec 11 04:00:43 np0005555140 kernel: ACPI: Added _OSI(Module Device)
Dec 11 04:00:43 np0005555140 kernel: ACPI: Added _OSI(Processor Device)
Dec 11 04:00:43 np0005555140 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 11 04:00:43 np0005555140 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 11 04:00:43 np0005555140 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 11 04:00:43 np0005555140 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 11 04:00:43 np0005555140 kernel: ACPI: Interpreter enabled
Dec 11 04:00:43 np0005555140 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 11 04:00:43 np0005555140 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 11 04:00:43 np0005555140 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 11 04:00:43 np0005555140 kernel: PCI: Using E820 reservations for host bridge windows
Dec 11 04:00:43 np0005555140 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 11 04:00:43 np0005555140 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 11 04:00:43 np0005555140 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [3] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [4] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [5] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [6] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [7] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [8] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [9] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [10] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [11] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [12] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [13] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [14] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [15] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [16] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [17] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [18] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [19] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [20] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [21] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [22] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [23] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [24] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [25] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [26] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [27] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [28] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [29] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [30] registered
Dec 11 04:00:43 np0005555140 kernel: acpiphp: Slot [31] registered
Dec 11 04:00:43 np0005555140 kernel: PCI host bridge to bus 0000:00
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 11 04:00:43 np0005555140 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 11 04:00:43 np0005555140 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 11 04:00:43 np0005555140 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 11 04:00:43 np0005555140 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 11 04:00:43 np0005555140 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 11 04:00:43 np0005555140 kernel: iommu: Default domain type: Translated
Dec 11 04:00:43 np0005555140 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 11 04:00:43 np0005555140 kernel: SCSI subsystem initialized
Dec 11 04:00:43 np0005555140 kernel: ACPI: bus type USB registered
Dec 11 04:00:43 np0005555140 kernel: usbcore: registered new interface driver usbfs
Dec 11 04:00:43 np0005555140 kernel: usbcore: registered new interface driver hub
Dec 11 04:00:43 np0005555140 kernel: usbcore: registered new device driver usb
Dec 11 04:00:43 np0005555140 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 11 04:00:43 np0005555140 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 11 04:00:43 np0005555140 kernel: PTP clock support registered
Dec 11 04:00:43 np0005555140 kernel: EDAC MC: Ver: 3.0.0
Dec 11 04:00:43 np0005555140 kernel: NetLabel: Initializing
Dec 11 04:00:43 np0005555140 kernel: NetLabel:  domain hash size = 128
Dec 11 04:00:43 np0005555140 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 11 04:00:43 np0005555140 kernel: NetLabel:  unlabeled traffic allowed by default
Dec 11 04:00:43 np0005555140 kernel: PCI: Using ACPI for IRQ routing
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 11 04:00:43 np0005555140 kernel: vgaarb: loaded
Dec 11 04:00:43 np0005555140 kernel: clocksource: Switched to clocksource kvm-clock
Dec 11 04:00:43 np0005555140 kernel: VFS: Disk quotas dquot_6.6.0
Dec 11 04:00:43 np0005555140 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 11 04:00:43 np0005555140 kernel: pnp: PnP ACPI init
Dec 11 04:00:43 np0005555140 kernel: pnp: PnP ACPI: found 5 devices
Dec 11 04:00:43 np0005555140 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 11 04:00:43 np0005555140 kernel: NET: Registered PF_INET protocol family
Dec 11 04:00:43 np0005555140 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 11 04:00:43 np0005555140 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 11 04:00:43 np0005555140 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 11 04:00:43 np0005555140 kernel: NET: Registered PF_XDP protocol family
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 11 04:00:43 np0005555140 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 11 04:00:43 np0005555140 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 11 04:00:43 np0005555140 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 74734 usecs
Dec 11 04:00:43 np0005555140 kernel: PCI: CLS 0 bytes, default 64
Dec 11 04:00:43 np0005555140 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 11 04:00:43 np0005555140 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 11 04:00:43 np0005555140 kernel: ACPI: bus type thunderbolt registered
Dec 11 04:00:43 np0005555140 kernel: Trying to unpack rootfs image as initramfs...
Dec 11 04:00:43 np0005555140 kernel: Initialise system trusted keyrings
Dec 11 04:00:43 np0005555140 kernel: Key type blacklist registered
Dec 11 04:00:43 np0005555140 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 11 04:00:43 np0005555140 kernel: zbud: loaded
Dec 11 04:00:43 np0005555140 kernel: integrity: Platform Keyring initialized
Dec 11 04:00:43 np0005555140 kernel: integrity: Machine keyring initialized
Dec 11 04:00:43 np0005555140 kernel: Freeing initrd memory: 87820K
Dec 11 04:00:43 np0005555140 kernel: NET: Registered PF_ALG protocol family
Dec 11 04:00:43 np0005555140 kernel: xor: automatically using best checksumming function   avx       
Dec 11 04:00:43 np0005555140 kernel: Key type asymmetric registered
Dec 11 04:00:43 np0005555140 kernel: Asymmetric key parser 'x509' registered
Dec 11 04:00:43 np0005555140 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 11 04:00:43 np0005555140 kernel: io scheduler mq-deadline registered
Dec 11 04:00:43 np0005555140 kernel: io scheduler kyber registered
Dec 11 04:00:43 np0005555140 kernel: io scheduler bfq registered
Dec 11 04:00:43 np0005555140 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 11 04:00:43 np0005555140 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 11 04:00:43 np0005555140 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 11 04:00:43 np0005555140 kernel: ACPI: button: Power Button [PWRF]
Dec 11 04:00:43 np0005555140 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 11 04:00:43 np0005555140 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 11 04:00:43 np0005555140 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 11 04:00:43 np0005555140 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 11 04:00:43 np0005555140 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 11 04:00:43 np0005555140 kernel: Non-volatile memory driver v1.3
Dec 11 04:00:43 np0005555140 kernel: rdac: device handler registered
Dec 11 04:00:43 np0005555140 kernel: hp_sw: device handler registered
Dec 11 04:00:43 np0005555140 kernel: emc: device handler registered
Dec 11 04:00:43 np0005555140 kernel: alua: device handler registered
Dec 11 04:00:43 np0005555140 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 11 04:00:43 np0005555140 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 11 04:00:43 np0005555140 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 11 04:00:43 np0005555140 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 11 04:00:43 np0005555140 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 11 04:00:43 np0005555140 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 11 04:00:43 np0005555140 kernel: usb usb1: Product: UHCI Host Controller
Dec 11 04:00:43 np0005555140 kernel: usb usb1: Manufacturer: Linux 5.14.0-648.el9.x86_64 uhci_hcd
Dec 11 04:00:43 np0005555140 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 11 04:00:43 np0005555140 kernel: hub 1-0:1.0: USB hub found
Dec 11 04:00:43 np0005555140 kernel: hub 1-0:1.0: 2 ports detected
Dec 11 04:00:43 np0005555140 kernel: usbcore: registered new interface driver usbserial_generic
Dec 11 04:00:43 np0005555140 kernel: usbserial: USB Serial support registered for generic
Dec 11 04:00:43 np0005555140 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 11 04:00:43 np0005555140 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 11 04:00:43 np0005555140 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 11 04:00:43 np0005555140 kernel: mousedev: PS/2 mouse device common for all mice
Dec 11 04:00:43 np0005555140 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 11 04:00:43 np0005555140 kernel: rtc_cmos 00:04: registered as rtc0
Dec 11 04:00:43 np0005555140 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 11 04:00:43 np0005555140 kernel: rtc_cmos 00:04: setting system clock to 2025-12-11T09:00:42 UTC (1765443642)
Dec 11 04:00:43 np0005555140 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 11 04:00:43 np0005555140 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 11 04:00:43 np0005555140 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 11 04:00:43 np0005555140 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 11 04:00:43 np0005555140 kernel: usbcore: registered new interface driver usbhid
Dec 11 04:00:43 np0005555140 kernel: usbhid: USB HID core driver
Dec 11 04:00:43 np0005555140 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 11 04:00:43 np0005555140 kernel: drop_monitor: Initializing network drop monitor service
Dec 11 04:00:43 np0005555140 kernel: Initializing XFRM netlink socket
Dec 11 04:00:43 np0005555140 kernel: NET: Registered PF_INET6 protocol family
Dec 11 04:00:43 np0005555140 kernel: Segment Routing with IPv6
Dec 11 04:00:43 np0005555140 kernel: NET: Registered PF_PACKET protocol family
Dec 11 04:00:43 np0005555140 kernel: mpls_gso: MPLS GSO support
Dec 11 04:00:43 np0005555140 kernel: IPI shorthand broadcast: enabled
Dec 11 04:00:43 np0005555140 kernel: AVX2 version of gcm_enc/dec engaged.
Dec 11 04:00:43 np0005555140 kernel: AES CTR mode by8 optimization enabled
Dec 11 04:00:43 np0005555140 kernel: sched_clock: Marking stable (1182010349, 141726550)->(1435424039, -111687140)
Dec 11 04:00:43 np0005555140 kernel: registered taskstats version 1
Dec 11 04:00:43 np0005555140 kernel: Loading compiled-in X.509 certificates
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bcc7fcdcfd9be61e8634554e9f7a1c01f32489d8'
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 11 04:00:43 np0005555140 kernel: Demotion targets for Node 0: null
Dec 11 04:00:43 np0005555140 kernel: page_owner is disabled
Dec 11 04:00:43 np0005555140 kernel: Key type .fscrypt registered
Dec 11 04:00:43 np0005555140 kernel: Key type fscrypt-provisioning registered
Dec 11 04:00:43 np0005555140 kernel: Key type big_key registered
Dec 11 04:00:43 np0005555140 kernel: Key type encrypted registered
Dec 11 04:00:43 np0005555140 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 11 04:00:43 np0005555140 kernel: Loading compiled-in module X.509 certificates
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bcc7fcdcfd9be61e8634554e9f7a1c01f32489d8'
Dec 11 04:00:43 np0005555140 kernel: ima: Allocated hash algorithm: sha256
Dec 11 04:00:43 np0005555140 kernel: ima: No architecture policies found
Dec 11 04:00:43 np0005555140 kernel: evm: Initialising EVM extended attributes:
Dec 11 04:00:43 np0005555140 kernel: evm: security.selinux
Dec 11 04:00:43 np0005555140 kernel: evm: security.SMACK64 (disabled)
Dec 11 04:00:43 np0005555140 kernel: evm: security.SMACK64EXEC (disabled)
Dec 11 04:00:43 np0005555140 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 11 04:00:43 np0005555140 kernel: evm: security.SMACK64MMAP (disabled)
Dec 11 04:00:43 np0005555140 kernel: evm: security.apparmor (disabled)
Dec 11 04:00:43 np0005555140 kernel: evm: security.ima
Dec 11 04:00:43 np0005555140 kernel: evm: security.capability
Dec 11 04:00:43 np0005555140 kernel: evm: HMAC attrs: 0x1
Dec 11 04:00:43 np0005555140 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 11 04:00:43 np0005555140 kernel: Running certificate verification RSA selftest
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 11 04:00:43 np0005555140 kernel: Running certificate verification ECDSA selftest
Dec 11 04:00:43 np0005555140 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 11 04:00:43 np0005555140 kernel: clk: Disabling unused clocks
Dec 11 04:00:43 np0005555140 kernel: Freeing unused decrypted memory: 2028K
Dec 11 04:00:43 np0005555140 kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec 11 04:00:43 np0005555140 kernel: Write protecting the kernel read-only data: 30720k
Dec 11 04:00:43 np0005555140 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Dec 11 04:00:43 np0005555140 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 11 04:00:43 np0005555140 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 11 04:00:43 np0005555140 kernel: usb 1-1: Product: QEMU USB Tablet
Dec 11 04:00:43 np0005555140 kernel: usb 1-1: Manufacturer: QEMU
Dec 11 04:00:43 np0005555140 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 11 04:00:43 np0005555140 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 11 04:00:43 np0005555140 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 11 04:00:43 np0005555140 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 11 04:00:43 np0005555140 kernel: Run /init as init process
Dec 11 04:00:43 np0005555140 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 11 04:00:43 np0005555140 systemd: Detected virtualization kvm.
Dec 11 04:00:43 np0005555140 systemd: Detected architecture x86-64.
Dec 11 04:00:43 np0005555140 systemd: Running in initrd.
Dec 11 04:00:43 np0005555140 systemd: No hostname configured, using default hostname.
Dec 11 04:00:43 np0005555140 systemd: Hostname set to <localhost>.
Dec 11 04:00:43 np0005555140 systemd: Initializing machine ID from VM UUID.
Dec 11 04:00:43 np0005555140 systemd: Queued start job for default target Initrd Default Target.
Dec 11 04:00:43 np0005555140 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec 11 04:00:43 np0005555140 systemd: Reached target Local Encrypted Volumes.
Dec 11 04:00:43 np0005555140 systemd: Reached target Initrd /usr File System.
Dec 11 04:00:43 np0005555140 systemd: Reached target Local File Systems.
Dec 11 04:00:43 np0005555140 systemd: Reached target Path Units.
Dec 11 04:00:43 np0005555140 systemd: Reached target Slice Units.
Dec 11 04:00:43 np0005555140 systemd: Reached target Swaps.
Dec 11 04:00:43 np0005555140 systemd: Reached target Timer Units.
Dec 11 04:00:43 np0005555140 systemd: Listening on D-Bus System Message Bus Socket.
Dec 11 04:00:43 np0005555140 systemd: Listening on Journal Socket (/dev/log).
Dec 11 04:00:43 np0005555140 systemd: Listening on Journal Socket.
Dec 11 04:00:43 np0005555140 systemd: Listening on udev Control Socket.
Dec 11 04:00:43 np0005555140 systemd: Listening on udev Kernel Socket.
Dec 11 04:00:43 np0005555140 systemd: Reached target Socket Units.
Dec 11 04:00:43 np0005555140 systemd: Starting Create List of Static Device Nodes...
Dec 11 04:00:43 np0005555140 systemd: Starting Journal Service...
Dec 11 04:00:43 np0005555140 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 11 04:00:43 np0005555140 systemd: Starting Apply Kernel Variables...
Dec 11 04:00:43 np0005555140 systemd: Starting Create System Users...
Dec 11 04:00:43 np0005555140 systemd: Starting Setup Virtual Console...
Dec 11 04:00:43 np0005555140 systemd: Finished Create List of Static Device Nodes.
Dec 11 04:00:43 np0005555140 systemd: Finished Apply Kernel Variables.
Dec 11 04:00:43 np0005555140 systemd: Finished Create System Users.
Dec 11 04:00:43 np0005555140 systemd-journald[306]: Journal started
Dec 11 04:00:43 np0005555140 systemd-journald[306]: Runtime Journal (/run/log/journal/8f17f30a28b844d4928bb22923e377d2) is 8.0M, max 153.6M, 145.6M free.
Dec 11 04:00:43 np0005555140 systemd-sysusers[311]: Creating group 'users' with GID 100.
Dec 11 04:00:43 np0005555140 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Dec 11 04:00:43 np0005555140 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 11 04:00:43 np0005555140 systemd: Started Journal Service.
Dec 11 04:00:43 np0005555140 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 11 04:00:43 np0005555140 systemd[1]: Starting Create Volatile Files and Directories...
Dec 11 04:00:43 np0005555140 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 11 04:00:43 np0005555140 systemd[1]: Finished Create Volatile Files and Directories.
Dec 11 04:00:43 np0005555140 systemd[1]: Finished Setup Virtual Console.
Dec 11 04:00:43 np0005555140 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 11 04:00:43 np0005555140 systemd[1]: Starting dracut cmdline hook...
Dec 11 04:00:43 np0005555140 dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Dec 11 04:00:43 np0005555140 dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 11 04:00:43 np0005555140 systemd[1]: Finished dracut cmdline hook.
Dec 11 04:00:43 np0005555140 systemd[1]: Starting dracut pre-udev hook...
Dec 11 04:00:43 np0005555140 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 11 04:00:43 np0005555140 kernel: device-mapper: uevent: version 1.0.3
Dec 11 04:00:43 np0005555140 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 11 04:00:43 np0005555140 kernel: RPC: Registered named UNIX socket transport module.
Dec 11 04:00:43 np0005555140 kernel: RPC: Registered udp transport module.
Dec 11 04:00:43 np0005555140 kernel: RPC: Registered tcp transport module.
Dec 11 04:00:43 np0005555140 kernel: RPC: Registered tcp-with-tls transport module.
Dec 11 04:00:43 np0005555140 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 11 04:00:43 np0005555140 rpc.statd[444]: Version 2.5.4 starting
Dec 11 04:00:43 np0005555140 rpc.statd[444]: Initializing NSM state
Dec 11 04:00:43 np0005555140 rpc.idmapd[449]: Setting log level to 0
Dec 11 04:00:43 np0005555140 systemd[1]: Finished dracut pre-udev hook.
Dec 11 04:00:43 np0005555140 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 11 04:00:43 np0005555140 systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Dec 11 04:00:43 np0005555140 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 11 04:00:43 np0005555140 systemd[1]: Starting dracut pre-trigger hook...
Dec 11 04:00:43 np0005555140 systemd[1]: Finished dracut pre-trigger hook.
Dec 11 04:00:43 np0005555140 systemd[1]: Starting Coldplug All udev Devices...
Dec 11 04:00:43 np0005555140 systemd[1]: Created slice Slice /system/modprobe.
Dec 11 04:00:43 np0005555140 systemd[1]: Starting Load Kernel Module configfs...
Dec 11 04:00:43 np0005555140 systemd[1]: Finished Coldplug All udev Devices.
Dec 11 04:00:43 np0005555140 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 11 04:00:43 np0005555140 systemd[1]: Finished Load Kernel Module configfs.
Dec 11 04:00:43 np0005555140 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 11 04:00:43 np0005555140 systemd[1]: Reached target Network.
Dec 11 04:00:43 np0005555140 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 11 04:00:43 np0005555140 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 11 04:00:43 np0005555140 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 11 04:00:43 np0005555140 kernel: vda: vda1
Dec 11 04:00:43 np0005555140 systemd[1]: Starting dracut initqueue hook...
Dec 11 04:00:44 np0005555140 kernel: scsi host0: ata_piix
Dec 11 04:00:44 np0005555140 kernel: scsi host1: ata_piix
Dec 11 04:00:44 np0005555140 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 11 04:00:44 np0005555140 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 11 04:00:44 np0005555140 systemd[1]: Found device /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Initrd Root Device.
Dec 11 04:00:44 np0005555140 systemd[1]: Mounting Kernel Configuration File System...
Dec 11 04:00:44 np0005555140 kernel: ata1: found unknown device (class 0)
Dec 11 04:00:44 np0005555140 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 11 04:00:44 np0005555140 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 11 04:00:44 np0005555140 systemd-udevd[488]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:00:44 np0005555140 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 11 04:00:44 np0005555140 systemd[1]: Mounted Kernel Configuration File System.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target System Initialization.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Basic System.
Dec 11 04:00:44 np0005555140 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 11 04:00:44 np0005555140 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 11 04:00:44 np0005555140 systemd[1]: Finished dracut initqueue hook.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Preparation for Remote File Systems.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Remote Encrypted Volumes.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Remote File Systems.
Dec 11 04:00:44 np0005555140 systemd[1]: Starting dracut pre-mount hook...
Dec 11 04:00:44 np0005555140 systemd[1]: Finished dracut pre-mount hook.
Dec 11 04:00:44 np0005555140 systemd[1]: Starting File System Check on /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266...
Dec 11 04:00:44 np0005555140 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Dec 11 04:00:44 np0005555140 systemd[1]: Finished File System Check on /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266.
Dec 11 04:00:44 np0005555140 systemd[1]: Mounting /sysroot...
Dec 11 04:00:44 np0005555140 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 11 04:00:44 np0005555140 kernel: XFS (vda1): Mounting V5 Filesystem cbdedf45-ed1d-4952-82a8-33a12c0ba266
Dec 11 04:00:44 np0005555140 kernel: XFS (vda1): Ending clean mount
Dec 11 04:00:44 np0005555140 systemd[1]: Mounted /sysroot.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Initrd Root File System.
Dec 11 04:00:44 np0005555140 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 11 04:00:44 np0005555140 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 11 04:00:44 np0005555140 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Initrd File Systems.
Dec 11 04:00:44 np0005555140 systemd[1]: Reached target Initrd Default Target.
Dec 11 04:00:44 np0005555140 systemd[1]: Starting dracut mount hook...
Dec 11 04:00:44 np0005555140 systemd[1]: Finished dracut mount hook.
Dec 11 04:00:44 np0005555140 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 11 04:00:45 np0005555140 rpc.idmapd[449]: exiting on signal 15
Dec 11 04:00:45 np0005555140 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 11 04:00:45 np0005555140 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Network.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Timer Units.
Dec 11 04:00:45 np0005555140 systemd[1]: dbus.socket: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 11 04:00:45 np0005555140 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Initrd Default Target.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Basic System.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Initrd Root Device.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Initrd /usr File System.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Path Units.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Remote File Systems.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Slice Units.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Socket Units.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target System Initialization.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Local File Systems.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Swaps.
Dec 11 04:00:45 np0005555140 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped dracut mount hook.
Dec 11 04:00:45 np0005555140 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped dracut pre-mount hook.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped target Local Encrypted Volumes.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 11 04:00:45 np0005555140 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped dracut initqueue hook.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Apply Kernel Variables.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Create Volatile Files and Directories.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Coldplug All udev Devices.
Dec 11 04:00:45 np0005555140 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped dracut pre-trigger hook.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Setup Virtual Console.
Dec 11 04:00:45 np0005555140 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Closed udev Control Socket.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Closed udev Kernel Socket.
Dec 11 04:00:45 np0005555140 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped dracut pre-udev hook.
Dec 11 04:00:45 np0005555140 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped dracut cmdline hook.
Dec 11 04:00:45 np0005555140 systemd[1]: Starting Cleanup udev Database...
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 11 04:00:45 np0005555140 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Create List of Static Device Nodes.
Dec 11 04:00:45 np0005555140 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Stopped Create System Users.
Dec 11 04:00:45 np0005555140 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 11 04:00:45 np0005555140 systemd[1]: Finished Cleanup udev Database.
Dec 11 04:00:45 np0005555140 systemd[1]: Reached target Switch Root.
Dec 11 04:00:45 np0005555140 systemd[1]: Starting Switch Root...
Dec 11 04:00:45 np0005555140 systemd[1]: Switching root.
Dec 11 04:00:45 np0005555140 systemd-journald[306]: Journal stopped
Dec 11 04:00:46 np0005555140 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec 11 04:00:46 np0005555140 kernel: audit: type=1404 audit(1765443645.303:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 11 04:00:46 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:00:46 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:00:46 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:00:46 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:00:46 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:00:46 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:00:46 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:00:46 np0005555140 kernel: audit: type=1403 audit(1765443645.466:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 11 04:00:46 np0005555140 systemd: Successfully loaded SELinux policy in 166.181ms.
Dec 11 04:00:46 np0005555140 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.409ms.
Dec 11 04:00:46 np0005555140 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 11 04:00:46 np0005555140 systemd: Detected virtualization kvm.
Dec 11 04:00:46 np0005555140 systemd: Detected architecture x86-64.
Dec 11 04:00:46 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:00:46 np0005555140 systemd: initrd-switch-root.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd: Stopped Switch Root.
Dec 11 04:00:46 np0005555140 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 11 04:00:46 np0005555140 systemd: Created slice Slice /system/getty.
Dec 11 04:00:46 np0005555140 systemd: Created slice Slice /system/serial-getty.
Dec 11 04:00:46 np0005555140 systemd: Created slice Slice /system/sshd-keygen.
Dec 11 04:00:46 np0005555140 systemd: Created slice User and Session Slice.
Dec 11 04:00:46 np0005555140 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec 11 04:00:46 np0005555140 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec 11 04:00:46 np0005555140 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 11 04:00:46 np0005555140 systemd: Reached target Local Encrypted Volumes.
Dec 11 04:00:46 np0005555140 systemd: Stopped target Switch Root.
Dec 11 04:00:46 np0005555140 systemd: Stopped target Initrd File Systems.
Dec 11 04:00:46 np0005555140 systemd: Stopped target Initrd Root File System.
Dec 11 04:00:46 np0005555140 systemd: Reached target Local Integrity Protected Volumes.
Dec 11 04:00:46 np0005555140 systemd: Reached target Path Units.
Dec 11 04:00:46 np0005555140 systemd: Reached target rpc_pipefs.target.
Dec 11 04:00:46 np0005555140 systemd: Reached target Slice Units.
Dec 11 04:00:46 np0005555140 systemd: Reached target Swaps.
Dec 11 04:00:46 np0005555140 systemd: Reached target Local Verity Protected Volumes.
Dec 11 04:00:46 np0005555140 systemd: Listening on RPCbind Server Activation Socket.
Dec 11 04:00:46 np0005555140 systemd: Reached target RPC Port Mapper.
Dec 11 04:00:46 np0005555140 systemd: Listening on Process Core Dump Socket.
Dec 11 04:00:46 np0005555140 systemd: Listening on initctl Compatibility Named Pipe.
Dec 11 04:00:46 np0005555140 systemd: Listening on udev Control Socket.
Dec 11 04:00:46 np0005555140 systemd: Listening on udev Kernel Socket.
Dec 11 04:00:46 np0005555140 systemd: Mounting Huge Pages File System...
Dec 11 04:00:46 np0005555140 systemd: Mounting POSIX Message Queue File System...
Dec 11 04:00:46 np0005555140 systemd: Mounting Kernel Debug File System...
Dec 11 04:00:46 np0005555140 systemd: Mounting Kernel Trace File System...
Dec 11 04:00:46 np0005555140 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 11 04:00:46 np0005555140 systemd: Starting Create List of Static Device Nodes...
Dec 11 04:00:46 np0005555140 systemd: Starting Load Kernel Module configfs...
Dec 11 04:00:46 np0005555140 systemd: Starting Load Kernel Module drm...
Dec 11 04:00:46 np0005555140 systemd: Starting Load Kernel Module efi_pstore...
Dec 11 04:00:46 np0005555140 systemd: Starting Load Kernel Module fuse...
Dec 11 04:00:46 np0005555140 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 11 04:00:46 np0005555140 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd: Stopped File System Check on Root Device.
Dec 11 04:00:46 np0005555140 systemd: Stopped Journal Service.
Dec 11 04:00:46 np0005555140 kernel: fuse: init (API version 7.37)
Dec 11 04:00:46 np0005555140 systemd: Starting Journal Service...
Dec 11 04:00:46 np0005555140 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 11 04:00:46 np0005555140 systemd: Starting Generate network units from Kernel command line...
Dec 11 04:00:46 np0005555140 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 11 04:00:46 np0005555140 systemd: Starting Remount Root and Kernel File Systems...
Dec 11 04:00:46 np0005555140 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 11 04:00:46 np0005555140 systemd: Starting Apply Kernel Variables...
Dec 11 04:00:46 np0005555140 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 11 04:00:46 np0005555140 systemd: Starting Coldplug All udev Devices...
Dec 11 04:00:46 np0005555140 systemd-journald[678]: Journal started
Dec 11 04:00:46 np0005555140 systemd-journald[678]: Runtime Journal (/run/log/journal/64f1d6692049d8be5e8b216cc203502c) is 8.0M, max 153.6M, 145.6M free.
Dec 11 04:00:46 np0005555140 systemd[1]: Queued start job for default target Multi-User System.
Dec 11 04:00:46 np0005555140 systemd: Mounted Huge Pages File System.
Dec 11 04:00:46 np0005555140 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd: Started Journal Service.
Dec 11 04:00:46 np0005555140 systemd[1]: Mounted POSIX Message Queue File System.
Dec 11 04:00:46 np0005555140 kernel: ACPI: bus type drm_connector registered
Dec 11 04:00:46 np0005555140 systemd[1]: Mounted Kernel Debug File System.
Dec 11 04:00:46 np0005555140 systemd[1]: Mounted Kernel Trace File System.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Create List of Static Device Nodes.
Dec 11 04:00:46 np0005555140 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Load Kernel Module configfs.
Dec 11 04:00:46 np0005555140 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Load Kernel Module drm.
Dec 11 04:00:46 np0005555140 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 11 04:00:46 np0005555140 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Load Kernel Module fuse.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Generate network units from Kernel command line.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Apply Kernel Variables.
Dec 11 04:00:46 np0005555140 systemd[1]: Mounting FUSE Control File System...
Dec 11 04:00:46 np0005555140 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Rebuild Hardware Database...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 11 04:00:46 np0005555140 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Load/Save OS Random Seed...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Create System Users...
Dec 11 04:00:46 np0005555140 systemd-journald[678]: Runtime Journal (/run/log/journal/64f1d6692049d8be5e8b216cc203502c) is 8.0M, max 153.6M, 145.6M free.
Dec 11 04:00:46 np0005555140 systemd-journald[678]: Received client request to flush runtime journal.
Dec 11 04:00:46 np0005555140 systemd[1]: Mounted FUSE Control File System.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Load/Save OS Random Seed.
Dec 11 04:00:46 np0005555140 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Create System Users.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Coldplug All udev Devices.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target Preparation for Local File Systems.
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target Local File Systems.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 11 04:00:46 np0005555140 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 11 04:00:46 np0005555140 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 11 04:00:46 np0005555140 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Automatic Boot Loader Update...
Dec 11 04:00:46 np0005555140 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Create Volatile Files and Directories...
Dec 11 04:00:46 np0005555140 bootctl[696]: Couldn't find EFI system partition, skipping.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Automatic Boot Loader Update.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Create Volatile Files and Directories.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Security Auditing Service...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting RPC Bind...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Rebuild Journal Catalog...
Dec 11 04:00:46 np0005555140 auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 11 04:00:46 np0005555140 auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 11 04:00:46 np0005555140 systemd[1]: Started RPC Bind.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Rebuild Journal Catalog.
Dec 11 04:00:46 np0005555140 augenrules[707]: /sbin/augenrules: No change
Dec 11 04:00:46 np0005555140 augenrules[722]: No rules
Dec 11 04:00:46 np0005555140 augenrules[722]: enabled 1
Dec 11 04:00:46 np0005555140 augenrules[722]: failure 1
Dec 11 04:00:46 np0005555140 augenrules[722]: pid 702
Dec 11 04:00:46 np0005555140 augenrules[722]: rate_limit 0
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_limit 8192
Dec 11 04:00:46 np0005555140 augenrules[722]: lost 0
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog 4
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_wait_time 60000
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_wait_time_actual 0
Dec 11 04:00:46 np0005555140 augenrules[722]: enabled 1
Dec 11 04:00:46 np0005555140 augenrules[722]: failure 1
Dec 11 04:00:46 np0005555140 augenrules[722]: pid 702
Dec 11 04:00:46 np0005555140 augenrules[722]: rate_limit 0
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_limit 8192
Dec 11 04:00:46 np0005555140 augenrules[722]: lost 0
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog 1
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_wait_time 60000
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_wait_time_actual 0
Dec 11 04:00:46 np0005555140 augenrules[722]: enabled 1
Dec 11 04:00:46 np0005555140 augenrules[722]: failure 1
Dec 11 04:00:46 np0005555140 augenrules[722]: pid 702
Dec 11 04:00:46 np0005555140 augenrules[722]: rate_limit 0
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_limit 8192
Dec 11 04:00:46 np0005555140 augenrules[722]: lost 0
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog 1
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_wait_time 60000
Dec 11 04:00:46 np0005555140 augenrules[722]: backlog_wait_time_actual 0
Dec 11 04:00:46 np0005555140 systemd[1]: Started Security Auditing Service.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Rebuild Hardware Database.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Update is Completed...
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Update is Completed.
Dec 11 04:00:46 np0005555140 systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Dec 11 04:00:46 np0005555140 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target System Initialization.
Dec 11 04:00:46 np0005555140 systemd[1]: Started dnf makecache --timer.
Dec 11 04:00:46 np0005555140 systemd[1]: Started Daily rotation of log files.
Dec 11 04:00:46 np0005555140 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target Timer Units.
Dec 11 04:00:46 np0005555140 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 11 04:00:46 np0005555140 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target Socket Units.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting D-Bus System Message Bus...
Dec 11 04:00:46 np0005555140 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 11 04:00:46 np0005555140 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Load Kernel Module configfs...
Dec 11 04:00:46 np0005555140 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Load Kernel Module configfs.
Dec 11 04:00:46 np0005555140 systemd-udevd[745]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:00:46 np0005555140 systemd[1]: Started D-Bus System Message Bus.
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target Basic System.
Dec 11 04:00:46 np0005555140 dbus-broker-lau[756]: Ready
Dec 11 04:00:46 np0005555140 systemd[1]: Starting NTP client/server...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 11 04:00:46 np0005555140 systemd[1]: Starting IPv4 firewall with iptables...
Dec 11 04:00:46 np0005555140 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 11 04:00:46 np0005555140 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 11 04:00:46 np0005555140 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 11 04:00:46 np0005555140 systemd[1]: Started irqbalance daemon.
Dec 11 04:00:46 np0005555140 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 11 04:00:46 np0005555140 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 11 04:00:46 np0005555140 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 04:00:46 np0005555140 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 04:00:46 np0005555140 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target sshd-keygen.target.
Dec 11 04:00:46 np0005555140 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 11 04:00:46 np0005555140 systemd[1]: Reached target User and Group Name Lookups.
Dec 11 04:00:46 np0005555140 systemd[1]: Starting User Login Management...
Dec 11 04:00:46 np0005555140 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 11 04:00:46 np0005555140 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 11 04:00:46 np0005555140 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 11 04:00:46 np0005555140 kernel: Console: switching to colour dummy device 80x25
Dec 11 04:00:46 np0005555140 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 11 04:00:46 np0005555140 kernel: [drm] features: -context_init
Dec 11 04:00:46 np0005555140 kernel: [drm] number of scanouts: 1
Dec 11 04:00:46 np0005555140 kernel: [drm] number of cap sets: 0
Dec 11 04:00:46 np0005555140 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 11 04:00:46 np0005555140 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 11 04:00:46 np0005555140 kernel: Console: switching to colour frame buffer device 128x48
Dec 11 04:00:46 np0005555140 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 11 04:00:46 np0005555140 chronyd[795]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 11 04:00:46 np0005555140 chronyd[795]: Loaded 0 symmetric keys
Dec 11 04:00:46 np0005555140 chronyd[795]: Using right/UTC timezone to obtain leap second data
Dec 11 04:00:46 np0005555140 chronyd[795]: Loaded seccomp filter (level 2)
Dec 11 04:00:46 np0005555140 systemd[1]: Started NTP client/server.
Dec 11 04:00:47 np0005555140 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 11 04:00:47 np0005555140 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 11 04:00:47 np0005555140 systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 11 04:00:47 np0005555140 systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 11 04:00:47 np0005555140 systemd-logind[787]: New seat seat0.
Dec 11 04:00:47 np0005555140 systemd[1]: Started User Login Management.
Dec 11 04:00:47 np0005555140 kernel: kvm_amd: TSC scaling supported
Dec 11 04:00:47 np0005555140 kernel: kvm_amd: Nested Virtualization enabled
Dec 11 04:00:47 np0005555140 kernel: kvm_amd: Nested Paging enabled
Dec 11 04:00:47 np0005555140 kernel: kvm_amd: LBR virtualization supported
Dec 11 04:00:47 np0005555140 iptables.init[780]: iptables: Applying firewall rules: [  OK  ]
Dec 11 04:00:47 np0005555140 systemd[1]: Finished IPv4 firewall with iptables.
Dec 11 04:00:47 np0005555140 cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Thu, 11 Dec 2025 09:00:47 +0000. Up 5.97 seconds.
Dec 11 04:00:47 np0005555140 systemd[1]: run-cloud\x2dinit-tmp-tmp2sz21gq3.mount: Deactivated successfully.
Dec 11 04:00:47 np0005555140 systemd[1]: Starting Hostname Service...
Dec 11 04:00:47 np0005555140 systemd[1]: Started Hostname Service.
Dec 11 04:00:47 np0005555140 systemd-hostnamed[852]: Hostname set to <np0005555140.novalocal> (static)
Dec 11 04:00:47 np0005555140 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 11 04:00:47 np0005555140 systemd[1]: Reached target Preparation for Network.
Dec 11 04:00:47 np0005555140 systemd[1]: Starting Network Manager...
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9219] NetworkManager (version 1.54.2-1.el9) is starting... (boot:44e8beb3-23e7-4e74-aa29-1a4573300217)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9224] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9305] manager[0x55d9e24c8000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9341] hostname: hostname: using hostnamed
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9341] hostname: static hostname changed from (none) to "np0005555140.novalocal"
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9346] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9483] manager[0x55d9e24c8000]: rfkill: Wi-Fi hardware radio set enabled
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9484] manager[0x55d9e24c8000]: rfkill: WWAN hardware radio set enabled
Dec 11 04:00:47 np0005555140 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9559] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9559] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9560] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9560] manager: Networking is enabled by state file
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9562] settings: Loaded settings plugin: keyfile (internal)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9571] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9592] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9605] dhcp: init: Using DHCP client 'internal'
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9608] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9622] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9629] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9636] device (lo): Activation: starting connection 'lo' (d8ac9655-2a62-4b5a-8019-334c588c1842)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9645] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9649] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9675] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9680] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9682] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9684] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9686] device (eth0): carrier: link connected
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9690] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9695] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9701] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9705] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9705] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9707] manager: NetworkManager state is now CONNECTING
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9708] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9715] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9718] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:00:47 np0005555140 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9766] dhcp4 (eth0): state changed new lease, address=38.102.83.70
Dec 11 04:00:47 np0005555140 systemd[1]: Started Network Manager.
Dec 11 04:00:47 np0005555140 systemd[1]: Reached target Network.
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9788] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9824] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:00:47 np0005555140 systemd[1]: Starting Network Manager Wait Online...
Dec 11 04:00:47 np0005555140 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9899] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9900] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9902] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9909] device (lo): Activation: successful, device activated.
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9916] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9920] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9923] device (eth0): Activation: successful, device activated.
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9930] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 11 04:00:47 np0005555140 NetworkManager[856]: <info>  [1765443647.9933] manager: startup complete
Dec 11 04:00:47 np0005555140 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 04:00:48 np0005555140 systemd[1]: Started GSSAPI Proxy Daemon.
Dec 11 04:00:48 np0005555140 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 11 04:00:48 np0005555140 systemd[1]: Reached target NFS client services.
Dec 11 04:00:48 np0005555140 systemd[1]: Reached target Preparation for Remote File Systems.
Dec 11 04:00:48 np0005555140 systemd[1]: Reached target Remote File Systems.
Dec 11 04:00:48 np0005555140 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 11 04:00:48 np0005555140 systemd[1]: Finished Network Manager Wait Online.
Dec 11 04:00:48 np0005555140 systemd[1]: Starting Cloud-init: Network Stage...
Dec 11 04:00:48 np0005555140 cloud-init[920]: Cloud-init v. 24.4-7.el9 running 'init' at Thu, 11 Dec 2025 09:00:48 +0000. Up 6.93 seconds.
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |  eth0  | True |         38.102.83.70         | 255.255.255.0 | global | fa:16:3e:7e:29:91 |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe7e:2991/64 |       .       |  link  | fa:16:3e:7e:29:91 |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 11 04:00:48 np0005555140 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 11 04:00:49 np0005555140 cloud-init[920]: Generating public/private rsa key pair.
Dec 11 04:00:49 np0005555140 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 11 04:00:49 np0005555140 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 11 04:00:49 np0005555140 cloud-init[920]: The key fingerprint is:
Dec 11 04:00:49 np0005555140 cloud-init[920]: SHA256:19OyG5KQylLTp0d4l9sOsfdps8eTbgR8FfSFM1cstDo root@np0005555140.novalocal
Dec 11 04:00:49 np0005555140 cloud-init[920]: The key's randomart image is:
Dec 11 04:00:49 np0005555140 cloud-init[920]: +---[RSA 3072]----+
Dec 11 04:00:49 np0005555140 cloud-init[920]: |             .o==|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |              =.*|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |            . .=o|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |       . o . * . |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |      o S = E +  |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |     o o B o X . |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |    . o . + * +..|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |     .   . . * B+|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |            . =+*|
Dec 11 04:00:49 np0005555140 cloud-init[920]: +----[SHA256]-----+
Dec 11 04:00:49 np0005555140 cloud-init[920]: Generating public/private ecdsa key pair.
Dec 11 04:00:49 np0005555140 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 11 04:00:49 np0005555140 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 11 04:00:49 np0005555140 cloud-init[920]: The key fingerprint is:
Dec 11 04:00:49 np0005555140 cloud-init[920]: SHA256:XlTKQKxW7Jbmgv8MvFZT1mrvEhqnUNr5aOtCGlJI6YY root@np0005555140.novalocal
Dec 11 04:00:49 np0005555140 cloud-init[920]: The key's randomart image is:
Dec 11 04:00:49 np0005555140 cloud-init[920]: +---[ECDSA 256]---+
Dec 11 04:00:49 np0005555140 cloud-init[920]: |   .   +o   .    |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |  o     +o o     |
Dec 11 04:00:49 np0005555140 cloud-init[920]: | + .   + .+.     |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |E + . o *.o .    |
Dec 11 04:00:49 np0005555140 cloud-init[920]: | . . o BS+..     |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |  . o.=.B.=      |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |   . =o+.X o     |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |    . ++= o .    |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |     ..==. o.    |
Dec 11 04:00:49 np0005555140 cloud-init[920]: +----[SHA256]-----+
Dec 11 04:00:49 np0005555140 cloud-init[920]: Generating public/private ed25519 key pair.
Dec 11 04:00:49 np0005555140 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 11 04:00:49 np0005555140 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 11 04:00:49 np0005555140 cloud-init[920]: The key fingerprint is:
Dec 11 04:00:49 np0005555140 cloud-init[920]: SHA256:5JBQpquW11c3ufXa3CObZgZReGthXp8b3Q4CioycBKY root@np0005555140.novalocal
Dec 11 04:00:49 np0005555140 cloud-init[920]: The key's randomart image is:
Dec 11 04:00:49 np0005555140 cloud-init[920]: +--[ED25519 256]--+
Dec 11 04:00:49 np0005555140 cloud-init[920]: |  o...o     .    |
Dec 11 04:00:49 np0005555140 cloud-init[920]: | o  .+ .  .. = . |
Dec 11 04:00:49 np0005555140 cloud-init[920]: |E  o.+o... .= + =|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |    +.o+.  ..=.o+|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |    .   S . *..oo|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |   o .   . o + o.|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |  + . . .   o   .|
Dec 11 04:00:49 np0005555140 cloud-init[920]: | . .   .     =.=.|
Dec 11 04:00:49 np0005555140 cloud-init[920]: |            +o+ =|
Dec 11 04:00:49 np0005555140 cloud-init[920]: +----[SHA256]-----+
Dec 11 04:00:49 np0005555140 systemd[1]: Finished Cloud-init: Network Stage.
Dec 11 04:00:49 np0005555140 systemd[1]: Reached target Cloud-config availability.
Dec 11 04:00:49 np0005555140 systemd[1]: Reached target Network is Online.
Dec 11 04:00:49 np0005555140 systemd[1]: Starting Cloud-init: Config Stage...
Dec 11 04:00:49 np0005555140 systemd[1]: Starting Crash recovery kernel arming...
Dec 11 04:00:49 np0005555140 systemd[1]: Starting Notify NFS peers of a restart...
Dec 11 04:00:49 np0005555140 systemd[1]: Starting System Logging Service...
Dec 11 04:00:49 np0005555140 sm-notify[1004]: Version 2.5.4 starting
Dec 11 04:00:49 np0005555140 systemd[1]: Starting OpenSSH server daemon...
Dec 11 04:00:49 np0005555140 systemd[1]: Starting Permit User Sessions...
Dec 11 04:00:49 np0005555140 systemd[1]: Started Notify NFS peers of a restart.
Dec 11 04:00:49 np0005555140 systemd[1]: Started OpenSSH server daemon.
Dec 11 04:00:49 np0005555140 rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Dec 11 04:00:49 np0005555140 rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 11 04:00:49 np0005555140 systemd[1]: Started System Logging Service.
Dec 11 04:00:49 np0005555140 systemd[1]: Finished Permit User Sessions.
Dec 11 04:00:49 np0005555140 systemd[1]: Started Command Scheduler.
Dec 11 04:00:49 np0005555140 systemd[1]: Started Getty on tty1.
Dec 11 04:00:49 np0005555140 systemd[1]: Started Serial Getty on ttyS0.
Dec 11 04:00:49 np0005555140 systemd[1]: Reached target Login Prompts.
Dec 11 04:00:49 np0005555140 systemd[1]: Reached target Multi-User System.
Dec 11 04:00:49 np0005555140 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 11 04:00:49 np0005555140 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 11 04:00:49 np0005555140 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 11 04:00:49 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:00:49 np0005555140 kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Dec 11 04:00:49 np0005555140 kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-648.el9.x86_64kdump.img
Dec 11 04:00:49 np0005555140 cloud-init[1125]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Thu, 11 Dec 2025 09:00:49 +0000. Up 8.48 seconds.
Dec 11 04:00:49 np0005555140 systemd[1]: Finished Cloud-init: Config Stage.
Dec 11 04:00:49 np0005555140 systemd[1]: Starting Cloud-init: Final Stage...
Dec 11 04:00:50 np0005555140 dracut[1266]: dracut-057-102.git20250818.el9
Dec 11 04:00:50 np0005555140 cloud-init[1284]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Thu, 11 Dec 2025 09:00:50 +0000. Up 8.89 seconds.
Dec 11 04:00:50 np0005555140 cloud-init[1288]: #############################################################
Dec 11 04:00:50 np0005555140 cloud-init[1290]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 11 04:00:50 np0005555140 cloud-init[1298]: 256 SHA256:XlTKQKxW7Jbmgv8MvFZT1mrvEhqnUNr5aOtCGlJI6YY root@np0005555140.novalocal (ECDSA)
Dec 11 04:00:50 np0005555140 dracut[1268]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-648.el9.x86_64kdump.img 5.14.0-648.el9.x86_64
Dec 11 04:00:50 np0005555140 cloud-init[1304]: 256 SHA256:5JBQpquW11c3ufXa3CObZgZReGthXp8b3Q4CioycBKY root@np0005555140.novalocal (ED25519)
Dec 11 04:00:50 np0005555140 cloud-init[1311]: 3072 SHA256:19OyG5KQylLTp0d4l9sOsfdps8eTbgR8FfSFM1cstDo root@np0005555140.novalocal (RSA)
Dec 11 04:00:50 np0005555140 cloud-init[1313]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 11 04:00:50 np0005555140 cloud-init[1315]: #############################################################
Dec 11 04:00:50 np0005555140 cloud-init[1284]: Cloud-init v. 24.4-7.el9 finished at Thu, 11 Dec 2025 09:00:50 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.06 seconds
Dec 11 04:00:50 np0005555140 systemd[1]: Finished Cloud-init: Final Stage.
Dec 11 04:00:50 np0005555140 systemd[1]: Reached target Cloud-init target.
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 11 04:00:50 np0005555140 dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: memstrack is not available
Dec 11 04:00:51 np0005555140 dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 11 04:00:51 np0005555140 dracut[1268]: memstrack is not available
Dec 11 04:00:51 np0005555140 dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 11 04:00:51 np0005555140 dracut[1268]: *** Including module: systemd ***
Dec 11 04:00:52 np0005555140 dracut[1268]: *** Including module: fips ***
Dec 11 04:00:52 np0005555140 dracut[1268]: *** Including module: systemd-initrd ***
Dec 11 04:00:52 np0005555140 dracut[1268]: *** Including module: i18n ***
Dec 11 04:00:52 np0005555140 dracut[1268]: *** Including module: drm ***
Dec 11 04:00:52 np0005555140 chronyd[795]: Selected source 206.108.0.131 (2.centos.pool.ntp.org)
Dec 11 04:00:52 np0005555140 chronyd[795]: System clock TAI offset set to 37 seconds
Dec 11 04:00:53 np0005555140 dracut[1268]: *** Including module: prefixdevname ***
Dec 11 04:00:53 np0005555140 dracut[1268]: *** Including module: kernel-modules ***
Dec 11 04:00:53 np0005555140 kernel: block vda: the capability attribute has been deprecated.
Dec 11 04:00:53 np0005555140 dracut[1268]: *** Including module: kernel-modules-extra ***
Dec 11 04:00:53 np0005555140 dracut[1268]: *** Including module: qemu ***
Dec 11 04:00:54 np0005555140 dracut[1268]: *** Including module: fstab-sys ***
Dec 11 04:00:54 np0005555140 dracut[1268]: *** Including module: rootfs-block ***
Dec 11 04:00:54 np0005555140 dracut[1268]: *** Including module: terminfo ***
Dec 11 04:00:54 np0005555140 dracut[1268]: *** Including module: udev-rules ***
Dec 11 04:00:54 np0005555140 dracut[1268]: Skipping udev rule: 91-permissions.rules
Dec 11 04:00:54 np0005555140 dracut[1268]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 11 04:00:54 np0005555140 dracut[1268]: *** Including module: virtiofs ***
Dec 11 04:00:54 np0005555140 dracut[1268]: *** Including module: dracut-systemd ***
Dec 11 04:00:55 np0005555140 dracut[1268]: *** Including module: usrmount ***
Dec 11 04:00:55 np0005555140 dracut[1268]: *** Including module: base ***
Dec 11 04:00:55 np0005555140 dracut[1268]: *** Including module: fs-lib ***
Dec 11 04:00:55 np0005555140 dracut[1268]: *** Including module: kdumpbase ***
Dec 11 04:00:55 np0005555140 dracut[1268]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 11 04:00:55 np0005555140 dracut[1268]:  microcode_ctl module: mangling fw_dir
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 11 04:00:55 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 11 04:00:56 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 11 04:00:56 np0005555140 dracut[1268]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 11 04:00:56 np0005555140 dracut[1268]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 11 04:00:56 np0005555140 dracut[1268]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 11 04:00:56 np0005555140 dracut[1268]: *** Including module: openssl ***
Dec 11 04:00:56 np0005555140 dracut[1268]: *** Including module: shutdown ***
Dec 11 04:00:56 np0005555140 dracut[1268]: *** Including module: squash ***
Dec 11 04:00:56 np0005555140 dracut[1268]: *** Including modules done ***
Dec 11 04:00:56 np0005555140 dracut[1268]: *** Installing kernel module dependencies ***
Dec 11 04:00:57 np0005555140 dracut[1268]: *** Installing kernel module dependencies done ***
Dec 11 04:00:57 np0005555140 dracut[1268]: *** Resolving executable dependencies ***
Dec 11 04:00:57 np0005555140 irqbalance[781]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 11 04:00:57 np0005555140 irqbalance[781]: IRQ 25 affinity is now unmanaged
Dec 11 04:00:57 np0005555140 irqbalance[781]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 11 04:00:57 np0005555140 irqbalance[781]: IRQ 31 affinity is now unmanaged
Dec 11 04:00:57 np0005555140 irqbalance[781]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 11 04:00:57 np0005555140 irqbalance[781]: IRQ 28 affinity is now unmanaged
Dec 11 04:00:57 np0005555140 irqbalance[781]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 11 04:00:57 np0005555140 irqbalance[781]: IRQ 32 affinity is now unmanaged
Dec 11 04:00:57 np0005555140 irqbalance[781]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 11 04:00:57 np0005555140 irqbalance[781]: IRQ 30 affinity is now unmanaged
Dec 11 04:00:57 np0005555140 irqbalance[781]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 11 04:00:57 np0005555140 irqbalance[781]: IRQ 29 affinity is now unmanaged
Dec 11 04:00:58 np0005555140 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 04:00:58 np0005555140 dracut[1268]: *** Resolving executable dependencies done ***
Dec 11 04:00:58 np0005555140 dracut[1268]: *** Generating early-microcode cpio image ***
Dec 11 04:00:58 np0005555140 dracut[1268]: *** Store current command line parameters ***
Dec 11 04:00:58 np0005555140 dracut[1268]: Stored kernel commandline:
Dec 11 04:00:58 np0005555140 dracut[1268]: No dracut internal kernel commandline stored in the initramfs
Dec 11 04:00:58 np0005555140 dracut[1268]: *** Install squash loader ***
Dec 11 04:00:59 np0005555140 dracut[1268]: *** Squashing the files inside the initramfs ***
Dec 11 04:01:00 np0005555140 dracut[1268]: *** Squashing the files inside the initramfs done ***
Dec 11 04:01:00 np0005555140 dracut[1268]: *** Creating image file '/boot/initramfs-5.14.0-648.el9.x86_64kdump.img' ***
Dec 11 04:01:00 np0005555140 dracut[1268]: *** Hardlinking files ***
Dec 11 04:01:01 np0005555140 dracut[1268]: *** Hardlinking files done ***
Dec 11 04:01:01 np0005555140 dracut[1268]: *** Creating initramfs image file '/boot/initramfs-5.14.0-648.el9.x86_64kdump.img' done ***
Dec 11 04:01:01 np0005555140 kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Dec 11 04:01:01 np0005555140 kdumpctl[1016]: kdump: Starting kdump: [OK]
Dec 11 04:01:01 np0005555140 systemd[1]: Finished Crash recovery kernel arming.
Dec 11 04:01:01 np0005555140 systemd[1]: Startup finished in 1.568s (kernel) + 2.358s (initrd) + 16.545s (userspace) = 20.472s.
Dec 11 04:01:03 np0005555140 systemd[1]: Created slice User Slice of UID 1000.
Dec 11 04:01:03 np0005555140 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 11 04:01:03 np0005555140 systemd-logind[787]: New session 1 of user zuul.
Dec 11 04:01:03 np0005555140 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 11 04:01:03 np0005555140 systemd[1]: Starting User Manager for UID 1000...
Dec 11 04:01:03 np0005555140 systemd[4315]: Queued start job for default target Main User Target.
Dec 11 04:01:03 np0005555140 systemd[4315]: Created slice User Application Slice.
Dec 11 04:01:03 np0005555140 systemd[4315]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 11 04:01:03 np0005555140 systemd[4315]: Started Daily Cleanup of User's Temporary Directories.
Dec 11 04:01:03 np0005555140 systemd[4315]: Reached target Paths.
Dec 11 04:01:03 np0005555140 systemd[4315]: Reached target Timers.
Dec 11 04:01:03 np0005555140 systemd[4315]: Starting D-Bus User Message Bus Socket...
Dec 11 04:01:03 np0005555140 systemd[4315]: Starting Create User's Volatile Files and Directories...
Dec 11 04:01:03 np0005555140 systemd[4315]: Listening on D-Bus User Message Bus Socket.
Dec 11 04:01:03 np0005555140 systemd[4315]: Reached target Sockets.
Dec 11 04:01:03 np0005555140 systemd[4315]: Finished Create User's Volatile Files and Directories.
Dec 11 04:01:03 np0005555140 systemd[4315]: Reached target Basic System.
Dec 11 04:01:03 np0005555140 systemd[4315]: Reached target Main User Target.
Dec 11 04:01:03 np0005555140 systemd[4315]: Startup finished in 105ms.
Dec 11 04:01:03 np0005555140 systemd[1]: Started User Manager for UID 1000.
Dec 11 04:01:03 np0005555140 systemd[1]: Started Session 1 of User zuul.
Dec 11 04:01:04 np0005555140 python3[4397]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:01:06 np0005555140 python3[4425]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:01:12 np0005555140 python3[4483]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:01:13 np0005555140 python3[4523]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 11 04:01:15 np0005555140 python3[4549]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMV/tZkqg8v07RAMZga0dPh1OTvdLcf+UyaTgaV8MpKoG4yPfwpsQ65buGNCvUOB7wYjbcLI+a0Z9GMytppQijVS4gkjfKSq9WloGDqEgFlix9qq3qcUfz9Sazexdh1ShTBO9CMs3dVi4GC2OtHxyGMwBSjBVM4tEPxJ/iBN2AI4GvLvXhr/kSepTB8wqbG3eDApVqArWbw38sdl6YGmuGWJ6CMv/XxAovuqp6YPCj2acFjqHVhpAmxpFy8P/+FH28kUnKbijMeAjXGPN7v4gefEM5Ics9+PaBc3tUCXZ9UDA13F08JcpiOmkOD9dSwiZf/kZUEOFfqlbbda/cEqVWkwcbjVMd+yISG0HWdp5tnlkt/L/HkcDzituiKDXEogWoVRp5siijgmgTcra3oQgFekZ7tiUFpk8bgJlUcJKp/Dzu/D97hU3mi/fUhb5gm5OxrJ5Mp5dzVrIdNASXPGF1yhiEMcluKap/wMfl0GTfJFiBWyAie1pZ1N4AaEuc7mE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:15 np0005555140 python3[4573]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:15 np0005555140 python3[4672]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:01:16 np0005555140 python3[4743]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765443675.6641057-207-102813676274198/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=f8cf6339bc6d44f6b54e03f91c7a3ca3_id_rsa follow=False checksum=2fb320be04996ec83bf6a68fae832862429ced08 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:16 np0005555140 python3[4866]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:01:17 np0005555140 python3[4937]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765443676.5745268-240-273629068756176/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=f8cf6339bc6d44f6b54e03f91c7a3ca3_id_rsa.pub follow=False checksum=b214838bf8b2762e0460540b415c3cd18ed91ed4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:17 np0005555140 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 04:01:18 np0005555140 python3[4987]: ansible-ping Invoked with data=pong
Dec 11 04:01:19 np0005555140 python3[5011]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:01:21 np0005555140 python3[5069]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 11 04:01:22 np0005555140 python3[5101]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:22 np0005555140 python3[5125]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:22 np0005555140 python3[5149]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:23 np0005555140 python3[5173]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:23 np0005555140 python3[5197]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:23 np0005555140 python3[5221]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:25 np0005555140 python3[5247]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:25 np0005555140 python3[5325]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:01:26 np0005555140 python3[5398]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765443685.2762022-21-4525344421485/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:26 np0005555140 python3[5446]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:27 np0005555140 python3[5470]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:27 np0005555140 python3[5494]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:27 np0005555140 python3[5518]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:27 np0005555140 python3[5542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:28 np0005555140 python3[5566]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:28 np0005555140 python3[5590]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:28 np0005555140 python3[5614]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:29 np0005555140 python3[5638]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:29 np0005555140 python3[5662]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:29 np0005555140 python3[5686]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:29 np0005555140 python3[5710]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:30 np0005555140 python3[5734]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:30 np0005555140 python3[5758]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:30 np0005555140 python3[5782]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:31 np0005555140 python3[5806]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:31 np0005555140 python3[5830]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:31 np0005555140 python3[5854]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:31 np0005555140 python3[5878]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:32 np0005555140 python3[5902]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:32 np0005555140 python3[5926]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:32 np0005555140 python3[5950]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:33 np0005555140 python3[5974]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:33 np0005555140 python3[5998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:33 np0005555140 python3[6022]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:34 np0005555140 python3[6046]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:01:36 np0005555140 python3[6072]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 11 04:01:36 np0005555140 systemd[1]: Starting Time & Date Service...
Dec 11 04:01:36 np0005555140 systemd[1]: Started Time & Date Service.
Dec 11 04:01:36 np0005555140 systemd-timedated[6074]: Changed time zone to 'UTC' (UTC).
Dec 11 04:01:37 np0005555140 python3[6103]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:37 np0005555140 python3[6179]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:01:38 np0005555140 python3[6250]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765443697.533706-153-209380047991619/source _original_basename=tmpjz3ai7cb follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:38 np0005555140 python3[6350]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:01:38 np0005555140 python3[6421]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765443698.3789577-183-257242562198531/source _original_basename=tmpp6rze3dj follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:39 np0005555140 python3[6523]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:01:40 np0005555140 python3[6596]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765443699.526768-231-176970972349406/source _original_basename=tmpo2j9c432 follow=False checksum=18e69b4e7a766afddcd5db28cd6f47889284b7a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:40 np0005555140 python3[6644]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:01:40 np0005555140 python3[6670]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:01:41 np0005555140 python3[6750]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:01:41 np0005555140 python3[6823]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765443701.1087732-273-277980312304078/source _original_basename=tmpyndfi1el follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:01:42 np0005555140 python3[6874]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-767d-5e3b-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:01:42 np0005555140 python3[6902]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-767d-5e3b-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 11 04:01:54 np0005555140 python3[6931]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:02:06 np0005555140 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 11 04:02:20 np0005555140 python3[6959]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 11 04:02:57 np0005555140 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 11 04:02:57 np0005555140 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0362] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 11 04:02:58 np0005555140 systemd-udevd[6961]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0562] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0595] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0599] device (eth1): carrier: link connected
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0602] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0610] policy: auto-activating connection 'Wired connection 1' (bf1dd5df-db4b-356a-83a4-08676bcdd19e)
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0614] device (eth1): Activation: starting connection 'Wired connection 1' (bf1dd5df-db4b-356a-83a4-08676bcdd19e)
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0615] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0617] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0622] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:02:58 np0005555140 NetworkManager[856]: <info>  [1765443778.0626] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:02:58 np0005555140 python3[6987]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-47e3-76d4-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:03:05 np0005555140 python3[7067]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:03:06 np0005555140 python3[7140]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765443785.5394487-102-118202512214839/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=c7a2c75beb7e370af580c18d2e8c313fcd5931b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:03:07 np0005555140 python3[7190]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:03:07 np0005555140 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 11 04:03:07 np0005555140 systemd[1]: Stopped Network Manager Wait Online.
Dec 11 04:03:07 np0005555140 systemd[1]: Stopping Network Manager Wait Online...
Dec 11 04:03:07 np0005555140 systemd[1]: Stopping Network Manager...
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.1911] caught SIGTERM, shutting down normally.
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.1921] dhcp4 (eth0): canceled DHCP transaction
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.1922] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.1922] dhcp4 (eth0): state changed no lease
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.1926] manager: NetworkManager state is now CONNECTING
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.2029] dhcp4 (eth1): canceled DHCP transaction
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.2029] dhcp4 (eth1): state changed no lease
Dec 11 04:03:07 np0005555140 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 04:03:07 np0005555140 NetworkManager[856]: <info>  [1765443787.2084] exiting (success)
Dec 11 04:03:07 np0005555140 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 04:03:07 np0005555140 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 11 04:03:07 np0005555140 systemd[1]: Stopped Network Manager.
Dec 11 04:03:07 np0005555140 systemd[1]: Starting Network Manager...
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.2672] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:44e8beb3-23e7-4e74-aa29-1a4573300217)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.2676] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.2729] manager[0x55cd29c1c000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 11 04:03:07 np0005555140 systemd[1]: Starting Hostname Service...
Dec 11 04:03:07 np0005555140 systemd[1]: Started Hostname Service.
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3582] hostname: hostname: using hostnamed
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3583] hostname: static hostname changed from (none) to "np0005555140.novalocal"
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3587] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3591] manager[0x55cd29c1c000]: rfkill: Wi-Fi hardware radio set enabled
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3592] manager[0x55cd29c1c000]: rfkill: WWAN hardware radio set enabled
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3616] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3617] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3617] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3617] manager: Networking is enabled by state file
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3620] settings: Loaded settings plugin: keyfile (internal)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3623] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3647] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3657] dhcp: init: Using DHCP client 'internal'
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3660] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3667] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3671] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3679] device (lo): Activation: starting connection 'lo' (d8ac9655-2a62-4b5a-8019-334c588c1842)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3685] device (eth0): carrier: link connected
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3689] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3694] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3695] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3700] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3707] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3714] device (eth1): carrier: link connected
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3717] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3722] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (bf1dd5df-db4b-356a-83a4-08676bcdd19e) (indicated)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3723] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3727] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3733] device (eth1): Activation: starting connection 'Wired connection 1' (bf1dd5df-db4b-356a-83a4-08676bcdd19e)
Dec 11 04:03:07 np0005555140 systemd[1]: Started Network Manager.
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3746] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3751] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3753] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3755] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3758] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3761] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3763] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3766] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3770] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3777] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3781] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3790] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3793] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3805] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3811] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3817] device (lo): Activation: successful, device activated.
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3823] dhcp4 (eth0): state changed new lease, address=38.102.83.70
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3830] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3883] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 systemd[1]: Starting Network Manager Wait Online...
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3919] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3923] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3926] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3929] device (eth0): Activation: successful, device activated.
Dec 11 04:03:07 np0005555140 NetworkManager[7201]: <info>  [1765443787.3934] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 11 04:03:07 np0005555140 python3[7274]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-47e3-76d4-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:03:17 np0005555140 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 04:03:37 np0005555140 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.3760] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 04:03:52 np0005555140 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 04:03:52 np0005555140 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4088] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4090] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4093] device (eth1): Activation: successful, device activated.
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4099] manager: startup complete
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4100] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <warn>  [1765443832.4104] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4110] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 11 04:03:52 np0005555140 systemd[1]: Finished Network Manager Wait Online.
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4249] dhcp4 (eth1): canceled DHCP transaction
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4249] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4249] dhcp4 (eth1): state changed no lease
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4262] policy: auto-activating connection 'ci-private-network' (f0b1e71c-23c8-5785-97b9-a0c12a4c5b4b)
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4265] device (eth1): Activation: starting connection 'ci-private-network' (f0b1e71c-23c8-5785-97b9-a0c12a4c5b4b)
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4266] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4269] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4276] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4285] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4322] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4323] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:03:52 np0005555140 NetworkManager[7201]: <info>  [1765443832.4328] device (eth1): Activation: successful, device activated.
Dec 11 04:04:01 np0005555140 systemd[4315]: Starting Mark boot as successful...
Dec 11 04:04:01 np0005555140 systemd[4315]: Finished Mark boot as successful.
Dec 11 04:04:02 np0005555140 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 04:04:04 np0005555140 python3[7380]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:04:04 np0005555140 python3[7453]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765443844.1901712-259-140155976571563/source _original_basename=tmpsy5geue1 follow=False checksum=279d23ae00b83df2272ee2e851c2c7de46dd8675 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:05:05 np0005555140 systemd-logind[787]: Session 1 logged out. Waiting for processes to exit.
Dec 11 04:07:01 np0005555140 systemd[4315]: Created slice User Background Tasks Slice.
Dec 11 04:07:01 np0005555140 systemd[4315]: Starting Cleanup of User's Temporary Files and Directories...
Dec 11 04:07:01 np0005555140 systemd[4315]: Finished Cleanup of User's Temporary Files and Directories.
Dec 11 04:09:37 np0005555140 systemd-logind[787]: New session 3 of user zuul.
Dec 11 04:09:37 np0005555140 systemd[1]: Started Session 3 of User zuul.
Dec 11 04:09:37 np0005555140 python3[7512]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-d121-f399-000000001f03-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:09:38 np0005555140 python3[7540]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:09:38 np0005555140 python3[7566]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:09:38 np0005555140 python3[7593]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:09:38 np0005555140 python3[7619]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:09:39 np0005555140 python3[7645]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:09:39 np0005555140 python3[7723]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:09:40 np0005555140 python3[7796]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765444179.5281901-478-68537838797475/source _original_basename=tmpzun54iis follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:09:41 np0005555140 python3[7846]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:09:41 np0005555140 systemd[1]: Reloading.
Dec 11 04:09:41 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:09:42 np0005555140 python3[7901]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 11 04:09:42 np0005555140 python3[7927]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:09:43 np0005555140 python3[7955]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:09:43 np0005555140 python3[7983]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:09:43 np0005555140 python3[8011]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:09:44 np0005555140 python3[8038]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-d121-f399-000000001f0a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:09:44 np0005555140 python3[8068]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 11 04:09:46 np0005555140 systemd[1]: session-3.scope: Deactivated successfully.
Dec 11 04:09:46 np0005555140 systemd[1]: session-3.scope: Consumed 4.014s CPU time.
Dec 11 04:09:46 np0005555140 systemd-logind[787]: Session 3 logged out. Waiting for processes to exit.
Dec 11 04:09:46 np0005555140 systemd-logind[787]: Removed session 3.
Dec 11 04:09:48 np0005555140 systemd-logind[787]: New session 4 of user zuul.
Dec 11 04:09:48 np0005555140 systemd[1]: Started Session 4 of User zuul.
Dec 11 04:09:49 np0005555140 python3[8101]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 11 04:10:06 np0005555140 kernel: SELinux:  Converting 386 SID table entries...
Dec 11 04:10:06 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:10:06 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:10:06 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:10:06 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:10:06 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:10:06 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:10:06 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:10:15 np0005555140 kernel: SELinux:  Converting 386 SID table entries...
Dec 11 04:10:15 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:10:15 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:10:15 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:10:15 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:10:15 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:10:15 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:10:15 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:10:24 np0005555140 kernel: SELinux:  Converting 386 SID table entries...
Dec 11 04:10:24 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:10:24 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:10:24 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:10:24 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:10:24 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:10:24 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:10:24 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:10:26 np0005555140 setsebool[8167]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 11 04:10:26 np0005555140 setsebool[8167]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 11 04:10:38 np0005555140 kernel: SELinux:  Converting 389 SID table entries...
Dec 11 04:10:38 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:10:38 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:10:38 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:10:38 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:10:38 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:10:38 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:10:38 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:10:57 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 11 04:10:57 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:10:57 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:10:57 np0005555140 systemd[1]: Reloading.
Dec 11 04:10:57 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:10:58 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:11:13 np0005555140 python3[18220]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-0974-bca5-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:11:13 np0005555140 kernel: evm: overlay not supported
Dec 11 04:11:13 np0005555140 systemd[4315]: Starting D-Bus User Message Bus...
Dec 11 04:11:13 np0005555140 dbus-broker-launch[18691]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 11 04:11:13 np0005555140 dbus-broker-launch[18691]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 11 04:11:13 np0005555140 systemd[4315]: Started D-Bus User Message Bus.
Dec 11 04:11:13 np0005555140 dbus-broker-lau[18691]: Ready
Dec 11 04:11:13 np0005555140 systemd[4315]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 11 04:11:13 np0005555140 systemd[4315]: Created slice Slice /user.
Dec 11 04:11:13 np0005555140 systemd[4315]: podman-18617.scope: unit configures an IP firewall, but not running as root.
Dec 11 04:11:13 np0005555140 systemd[4315]: (This warning is only shown for the first unit using IP firewalling.)
Dec 11 04:11:13 np0005555140 systemd[4315]: Started podman-18617.scope.
Dec 11 04:11:13 np0005555140 systemd[4315]: Started podman-pause-016681c2.scope.
Dec 11 04:11:14 np0005555140 systemd[1]: session-4.scope: Deactivated successfully.
Dec 11 04:11:14 np0005555140 systemd[1]: session-4.scope: Consumed 1min 2.995s CPU time.
Dec 11 04:11:14 np0005555140 systemd-logind[787]: Session 4 logged out. Waiting for processes to exit.
Dec 11 04:11:14 np0005555140 systemd-logind[787]: Removed session 4.
Dec 11 04:11:17 np0005555140 irqbalance[781]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 11 04:11:17 np0005555140 irqbalance[781]: IRQ 27 affinity is now unmanaged
Dec 11 04:11:36 np0005555140 systemd-logind[787]: New session 5 of user zuul.
Dec 11 04:11:36 np0005555140 systemd[1]: Started Session 5 of User zuul.
Dec 11 04:11:36 np0005555140 python3[29226]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLFs02vGVJiplne5xjduG8zDo+7aGLQ0QInCL7vwOT4CzOdf99xfWtXXki6YvTaqlaBA/7c7oKhpFoxLi7LM9jk= zuul@np0005555139.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:11:36 np0005555140 python3[29371]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLFs02vGVJiplne5xjduG8zDo+7aGLQ0QInCL7vwOT4CzOdf99xfWtXXki6YvTaqlaBA/7c7oKhpFoxLi7LM9jk= zuul@np0005555139.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:11:37 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:11:37 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:11:37 np0005555140 systemd[1]: man-db-cache-update.service: Consumed 46.584s CPU time.
Dec 11 04:11:37 np0005555140 systemd[1]: run-rdf35ba4aadbc4566afb245fcc6d14c3b.service: Deactivated successfully.
Dec 11 04:11:37 np0005555140 python3[29640]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005555140.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 11 04:11:38 np0005555140 python3[29674]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLFs02vGVJiplne5xjduG8zDo+7aGLQ0QInCL7vwOT4CzOdf99xfWtXXki6YvTaqlaBA/7c7oKhpFoxLi7LM9jk= zuul@np0005555139.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 04:11:38 np0005555140 python3[29752]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:11:39 np0005555140 python3[29825]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765444298.3562355-118-48274079667888/source _original_basename=tmpz9j0ktud follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:11:39 np0005555140 python3[29875]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 11 04:11:39 np0005555140 systemd[1]: Starting Hostname Service...
Dec 11 04:11:39 np0005555140 systemd[1]: Started Hostname Service.
Dec 11 04:11:40 np0005555140 systemd-hostnamed[29879]: Changed pretty hostname to 'compute-0'
Dec 11 04:11:40 np0005555140 systemd-hostnamed[29879]: Hostname set to <compute-0> (static)
Dec 11 04:11:40 np0005555140 NetworkManager[7201]: <info>  [1765444300.0304] hostname: static hostname changed from "np0005555140.novalocal" to "compute-0"
Dec 11 04:11:40 np0005555140 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 04:11:40 np0005555140 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 04:11:40 np0005555140 systemd[1]: session-5.scope: Deactivated successfully.
Dec 11 04:11:40 np0005555140 systemd[1]: session-5.scope: Consumed 2.225s CPU time.
Dec 11 04:11:40 np0005555140 systemd-logind[787]: Session 5 logged out. Waiting for processes to exit.
Dec 11 04:11:40 np0005555140 systemd-logind[787]: Removed session 5.
Dec 11 04:11:50 np0005555140 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 04:12:10 np0005555140 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 04:16:01 np0005555140 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 11 04:16:01 np0005555140 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 11 04:16:01 np0005555140 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 11 04:16:01 np0005555140 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 11 04:16:20 np0005555140 systemd-logind[787]: New session 6 of user zuul.
Dec 11 04:16:20 np0005555140 systemd[1]: Started Session 6 of User zuul.
Dec 11 04:16:20 np0005555140 python3[29979]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:16:22 np0005555140 python3[30095]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:16:22 np0005555140 python3[30168]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765444581.927646-33521-143268075069561/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:16:22 np0005555140 python3[30194]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:16:23 np0005555140 python3[30267]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765444581.927646-33521-143268075069561/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:16:23 np0005555140 python3[30293]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:16:23 np0005555140 python3[30366]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765444581.927646-33521-143268075069561/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:16:24 np0005555140 python3[30392]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:16:24 np0005555140 python3[30465]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765444581.927646-33521-143268075069561/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:16:24 np0005555140 python3[30491]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:16:24 np0005555140 python3[30564]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765444581.927646-33521-143268075069561/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:16:25 np0005555140 python3[30590]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:16:25 np0005555140 python3[30663]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765444581.927646-33521-143268075069561/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:16:25 np0005555140 python3[30689]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 04:16:26 np0005555140 python3[30762]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765444581.927646-33521-143268075069561/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:16:39 np0005555140 python3[30820]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:20:01 np0005555140 systemd[1]: Starting dnf makecache...
Dec 11 04:20:01 np0005555140 dnf[30824]: Failed determining last makecache time.
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-barbican-42b4c41831408a8e323  61 kB/s |  13 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.5 MB/s |  65 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-python-stevedore-c4acc5639fd2329372142 1.9 MB/s | 131 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.4 MB/s |  32 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-os-refresh-config-9bfc52b5049be2d8de61  14 MB/s | 349 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 2.1 MB/s |  42 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-python-designate-tests-tempest-347fdbc 839 kB/s |  18 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-glance-1fd12c29b339f30fe823e 877 kB/s |  18 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.5 MB/s |  29 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-manila-3c01b7181572c95dac462 1.1 MB/s |  25 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-python-whitebox-neutron-tests-tempest- 6.2 MB/s | 154 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-octavia-ba397f07a7331190208c 1.2 MB/s |  26 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-watcher-c014f81a8647287f6dcc 1.0 MB/s |  16 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-ansible-config_template-5ccaa22121a7ff 361 kB/s | 7.4 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 6.1 MB/s | 144 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-swift-dc98a8463506ac520c469a 688 kB/s |  14 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-python-tempestconf-8515371b7cceebd4282 2.3 MB/s |  53 kB     00:00
Dec 11 04:20:02 np0005555140 dnf[30824]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.0 MB/s |  96 kB     00:00
Dec 11 04:20:03 np0005555140 dnf[30824]: CentOS Stream 9 - BaseOS                         49 kB/s | 7.0 kB     00:00
Dec 11 04:20:03 np0005555140 dnf[30824]: CentOS Stream 9 - AppStream                      75 kB/s | 7.4 kB     00:00
Dec 11 04:20:03 np0005555140 dnf[30824]: CentOS Stream 9 - CRB                            74 kB/s | 6.9 kB     00:00
Dec 11 04:20:03 np0005555140 dnf[30824]: CentOS Stream 9 - Extras packages                73 kB/s | 8.3 kB     00:00
Dec 11 04:20:03 np0005555140 dnf[30824]: dlrn-antelope-testing                            29 MB/s | 1.1 MB     00:00
Dec 11 04:20:04 np0005555140 dnf[30824]: dlrn-antelope-build-deps                         17 MB/s | 461 kB     00:00
Dec 11 04:20:04 np0005555140 dnf[30824]: centos9-rabbitmq                                6.8 MB/s | 123 kB     00:00
Dec 11 04:20:04 np0005555140 dnf[30824]: centos9-storage                                  17 MB/s | 415 kB     00:00
Dec 11 04:20:04 np0005555140 dnf[30824]: centos9-opstools                                4.7 MB/s |  51 kB     00:00
Dec 11 04:20:04 np0005555140 dnf[30824]: NFV SIG OpenvSwitch                              24 MB/s | 456 kB     00:00
Dec 11 04:20:05 np0005555140 dnf[30824]: repo-setup-centos-appstream                      87 MB/s |  26 MB     00:00
Dec 11 04:20:11 np0005555140 dnf[30824]: repo-setup-centos-baseos                         71 MB/s | 8.8 MB     00:00
Dec 11 04:20:12 np0005555140 dnf[30824]: repo-setup-centos-highavailability               30 MB/s | 744 kB     00:00
Dec 11 04:20:13 np0005555140 dnf[30824]: repo-setup-centos-powertools                     75 MB/s | 7.4 MB     00:00
Dec 11 04:20:15 np0005555140 dnf[30824]: Extra Packages for Enterprise Linux 9 - x86_64   31 MB/s |  20 MB     00:00
Dec 11 04:20:31 np0005555140 dnf[30824]: Metadata cache created.
Dec 11 04:20:31 np0005555140 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 11 04:20:31 np0005555140 systemd[1]: Finished dnf makecache.
Dec 11 04:20:31 np0005555140 systemd[1]: dnf-makecache.service: Consumed 28.162s CPU time.
Dec 11 04:21:38 np0005555140 systemd[1]: session-6.scope: Deactivated successfully.
Dec 11 04:21:38 np0005555140 systemd[1]: session-6.scope: Consumed 4.699s CPU time.
Dec 11 04:21:38 np0005555140 systemd-logind[787]: Session 6 logged out. Waiting for processes to exit.
Dec 11 04:21:38 np0005555140 systemd-logind[787]: Removed session 6.
Dec 11 04:27:28 np0005555140 systemd-logind[787]: New session 7 of user zuul.
Dec 11 04:27:28 np0005555140 systemd[1]: Started Session 7 of User zuul.
Dec 11 04:27:29 np0005555140 python3.9[31081]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:27:31 np0005555140 python3.9[31262]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:27:40 np0005555140 systemd[1]: session-7.scope: Deactivated successfully.
Dec 11 04:27:40 np0005555140 systemd[1]: session-7.scope: Consumed 8.078s CPU time.
Dec 11 04:27:40 np0005555140 systemd-logind[787]: Session 7 logged out. Waiting for processes to exit.
Dec 11 04:27:40 np0005555140 systemd-logind[787]: Removed session 7.
Dec 11 04:27:48 np0005555140 systemd-logind[787]: New session 8 of user zuul.
Dec 11 04:27:48 np0005555140 systemd[1]: Started Session 8 of User zuul.
Dec 11 04:27:49 np0005555140 python3.9[31473]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:27:50 np0005555140 systemd[1]: session-8.scope: Deactivated successfully.
Dec 11 04:27:50 np0005555140 systemd-logind[787]: Session 8 logged out. Waiting for processes to exit.
Dec 11 04:27:50 np0005555140 systemd-logind[787]: Removed session 8.
Dec 11 04:28:06 np0005555140 systemd-logind[787]: New session 9 of user zuul.
Dec 11 04:28:06 np0005555140 systemd[1]: Started Session 9 of User zuul.
Dec 11 04:28:06 np0005555140 python3.9[31656]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 11 04:28:08 np0005555140 python3.9[31830]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:28:09 np0005555140 python3.9[31982]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:28:09 np0005555140 python3.9[32135]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:28:10 np0005555140 python3.9[32287]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:28:11 np0005555140 python3.9[32439]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:28:12 np0005555140 python3.9[32562]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445290.9069302-73-77356470029764/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:28:12 np0005555140 python3.9[32714]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:28:13 np0005555140 python3.9[32870]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:28:14 np0005555140 python3.9[33022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:28:15 np0005555140 python3.9[33172]: ansible-ansible.builtin.service_facts Invoked
Dec 11 04:28:18 np0005555140 python3.9[33425]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:28:18 np0005555140 python3.9[33575]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:28:20 np0005555140 python3.9[33729]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:28:21 np0005555140 python3.9[33887]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:28:21 np0005555140 python3.9[33971]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:29:29 np0005555140 systemd[1]: Reloading.
Dec 11 04:29:29 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:29:29 np0005555140 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 11 04:29:29 np0005555140 systemd[1]: Reloading.
Dec 11 04:29:29 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:29:29 np0005555140 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 11 04:29:29 np0005555140 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 11 04:29:29 np0005555140 systemd[1]: Reloading.
Dec 11 04:29:30 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:29:30 np0005555140 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 11 04:29:30 np0005555140 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Dec 11 04:29:30 np0005555140 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Dec 11 04:29:30 np0005555140 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Dec 11 04:30:35 np0005555140 kernel: SELinux:  Converting 2719 SID table entries...
Dec 11 04:30:35 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:30:35 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:30:35 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:30:35 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:30:35 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:30:35 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:30:35 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:30:35 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 11 04:30:35 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:30:35 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:30:35 np0005555140 systemd[1]: Reloading.
Dec 11 04:30:35 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:30:36 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:30:37 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:30:37 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:30:37 np0005555140 systemd[1]: man-db-cache-update.service: Consumed 1.150s CPU time.
Dec 11 04:30:37 np0005555140 systemd[1]: run-r990ab1e7a20149a68be45a3ab9bec719.service: Deactivated successfully.
Dec 11 04:30:37 np0005555140 python3.9[35500]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:30:39 np0005555140 python3.9[35782]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 11 04:30:39 np0005555140 python3.9[35934]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 11 04:30:42 np0005555140 python3.9[36087]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:30:43 np0005555140 python3.9[36239]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 11 04:30:45 np0005555140 python3.9[36391]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:30:45 np0005555140 python3.9[36543]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:30:46 np0005555140 python3.9[36666]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445445.2040305-236-57689178642829/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:30:51 np0005555140 python3.9[36818]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:30:52 np0005555140 python3.9[36970]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:30:53 np0005555140 python3.9[37123]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:30:54 np0005555140 python3.9[37275]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 11 04:30:55 np0005555140 python3.9[37428]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 04:30:55 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:30:56 np0005555140 python3.9[37587]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 04:30:56 np0005555140 python3.9[37747]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 11 04:30:57 np0005555140 python3.9[37900]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 04:30:58 np0005555140 python3.9[38058]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 11 04:30:59 np0005555140 python3.9[38210]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:31:01 np0005555140 python3.9[38363]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:02 np0005555140 python3.9[38515]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:31:02 np0005555140 python3.9[38638]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445461.5843556-355-195149825851733/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:03 np0005555140 python3.9[38790]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:31:03 np0005555140 systemd[1]: Starting Load Kernel Modules...
Dec 11 04:31:03 np0005555140 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 11 04:31:03 np0005555140 kernel: Bridge firewalling registered
Dec 11 04:31:03 np0005555140 systemd-modules-load[38794]: Inserted module 'br_netfilter'
Dec 11 04:31:03 np0005555140 systemd[1]: Finished Load Kernel Modules.
Dec 11 04:31:04 np0005555140 python3.9[38949]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:31:05 np0005555140 python3.9[39072]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445464.053365-378-27063324996230/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:05 np0005555140 python3.9[39224]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:31:14 np0005555140 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Dec 11 04:31:14 np0005555140 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Dec 11 04:31:14 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:31:14 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:31:14 np0005555140 systemd[1]: Reloading.
Dec 11 04:31:14 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:31:14 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:31:15 np0005555140 python3.9[40550]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:31:16 np0005555140 python3.9[41486]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 11 04:31:17 np0005555140 python3.9[42405]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:31:18 np0005555140 python3.9[43253]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:31:18 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:31:18 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:31:18 np0005555140 systemd[1]: man-db-cache-update.service: Consumed 4.679s CPU time.
Dec 11 04:31:18 np0005555140 systemd[1]: run-r72051fdaf0ae4ee18b893b63f347007b.service: Deactivated successfully.
Dec 11 04:31:18 np0005555140 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 11 04:31:18 np0005555140 systemd[1]: Starting Authorization Manager...
Dec 11 04:31:18 np0005555140 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 11 04:31:18 np0005555140 polkitd[43614]: Started polkitd version 0.117
Dec 11 04:31:18 np0005555140 systemd[1]: Started Authorization Manager.
Dec 11 04:31:19 np0005555140 python3.9[43784]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:31:19 np0005555140 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 11 04:31:19 np0005555140 systemd[1]: tuned.service: Deactivated successfully.
Dec 11 04:31:19 np0005555140 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 11 04:31:19 np0005555140 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 11 04:31:19 np0005555140 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 11 04:31:20 np0005555140 python3.9[43945]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 11 04:31:22 np0005555140 python3.9[44097]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:31:22 np0005555140 systemd[1]: Reloading.
Dec 11 04:31:22 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:31:23 np0005555140 python3.9[44286]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:31:23 np0005555140 systemd[1]: Reloading.
Dec 11 04:31:24 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:31:24 np0005555140 python3.9[44475]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:31:25 np0005555140 python3.9[44628]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:31:25 np0005555140 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 11 04:31:26 np0005555140 python3.9[44781]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:31:27 np0005555140 irqbalance[781]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 11 04:31:27 np0005555140 irqbalance[781]: IRQ 26 affinity is now unmanaged
Dec 11 04:31:28 np0005555140 python3.9[44943]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:31:28 np0005555140 python3.9[45096]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:31:28 np0005555140 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 11 04:31:28 np0005555140 systemd[1]: Stopped Apply Kernel Variables.
Dec 11 04:31:28 np0005555140 systemd[1]: Stopping Apply Kernel Variables...
Dec 11 04:31:28 np0005555140 systemd[1]: Starting Apply Kernel Variables...
Dec 11 04:31:28 np0005555140 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 11 04:31:29 np0005555140 systemd[1]: Finished Apply Kernel Variables.
Dec 11 04:31:29 np0005555140 systemd[1]: session-9.scope: Deactivated successfully.
Dec 11 04:31:29 np0005555140 systemd[1]: session-9.scope: Consumed 2min 16.741s CPU time.
Dec 11 04:31:29 np0005555140 systemd-logind[787]: Session 9 logged out. Waiting for processes to exit.
Dec 11 04:31:29 np0005555140 systemd-logind[787]: Removed session 9.
Dec 11 04:31:36 np0005555140 systemd-logind[787]: New session 10 of user zuul.
Dec 11 04:31:36 np0005555140 systemd[1]: Started Session 10 of User zuul.
Dec 11 04:31:37 np0005555140 python3.9[45279]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:31:38 np0005555140 python3.9[45433]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:31:39 np0005555140 python3.9[45589]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:31:40 np0005555140 python3.9[45740]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:31:41 np0005555140 python3.9[45896]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:31:42 np0005555140 python3.9[45980]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:31:43 np0005555140 python3.9[46133]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:31:45 np0005555140 python3.9[46304]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:31:45 np0005555140 python3.9[46456]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:31:45 np0005555140 podman[46457]: 2025-12-11 09:31:45.774495252 +0000 UTC m=+0.050761175 system refresh
Dec 11 04:31:46 np0005555140 python3.9[46619]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:31:46 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:31:47 np0005555140 python3.9[46742]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445505.9593804-109-158510345357596/.source.json follow=False _original_basename=podman_network_config.j2 checksum=8fde862fb4c202c94752ecc2a92f8508abbb90a2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:31:48 np0005555140 python3.9[46894]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:31:48 np0005555140 python3.9[47017]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445507.4576511-124-78802215597424/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:49 np0005555140 python3.9[47169]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:49 np0005555140 python3.9[47321]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:50 np0005555140 python3.9[47473]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:51 np0005555140 python3.9[47625]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:31:52 np0005555140 python3.9[47775]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:31:53 np0005555140 python3.9[47929]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:31:55 np0005555140 python3.9[48082]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:31:58 np0005555140 python3.9[48242]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:32:00 np0005555140 python3.9[48395]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:32:02 np0005555140 python3.9[48548]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:32:05 np0005555140 python3.9[48704]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:32:15 np0005555140 python3.9[48873]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:32:17 np0005555140 python3.9[49026]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:32:37 np0005555140 python3.9[49363]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:32:39 np0005555140 python3.9[49519]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:32:40 np0005555140 python3.9[49694]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:32:41 np0005555140 python3.9[49817]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765445560.1441805-272-5629624772281/.source.json _original_basename=.5rp0o06x follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:32:42 np0005555140 python3.9[49969]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 04:32:42 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:32:45 np0005555140 systemd[1]: var-lib-containers-storage-overlay-compat1728288814-lower\x2dmapped.mount: Deactivated successfully.
Dec 11 04:32:48 np0005555140 podman[49979]: 2025-12-11 09:32:48.300749501 +0000 UTC m=+6.025723748 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Dec 11 04:32:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:32:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:32:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:32:49 np0005555140 python3.9[50276]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 04:32:49 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:32:59 np0005555140 podman[50289]: 2025-12-11 09:32:59.902474283 +0000 UTC m=+10.407328452 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:32:59 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:32:59 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:32:59 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:00 np0005555140 python3.9[50585]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 04:33:00 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:02 np0005555140 podman[50598]: 2025-12-11 09:33:02.205789159 +0000 UTC m=+1.271802618 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f
Dec 11 04:33:02 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:02 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:02 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:03 np0005555140 python3.9[50831]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 04:33:03 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:16 np0005555140 podman[50843]: 2025-12-11 09:33:16.668617431 +0000 UTC m=+13.272667074 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Dec 11 04:33:16 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:16 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:16 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:17 np0005555140 python3.9[51100]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 04:33:17 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:23 np0005555140 podman[51112]: 2025-12-11 09:33:23.407316741 +0000 UTC m=+5.621532981 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf
Dec 11 04:33:23 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:23 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:23 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:24 np0005555140 python3.9[51369]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 04:33:26 np0005555140 podman[51381]: 2025-12-11 09:33:26.78869391 +0000 UTC m=+2.424697656 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 11 04:33:26 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:26 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:26 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:33:27 np0005555140 systemd-logind[787]: Session 10 logged out. Waiting for processes to exit.
Dec 11 04:33:27 np0005555140 systemd[1]: session-10.scope: Deactivated successfully.
Dec 11 04:33:27 np0005555140 systemd[1]: session-10.scope: Consumed 1min 47.103s CPU time.
Dec 11 04:33:27 np0005555140 systemd-logind[787]: Removed session 10.
Dec 11 04:33:34 np0005555140 systemd-logind[787]: New session 11 of user zuul.
Dec 11 04:33:34 np0005555140 systemd[1]: Started Session 11 of User zuul.
Dec 11 04:33:35 np0005555140 python3.9[51684]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:33:36 np0005555140 python3.9[51840]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 11 04:33:37 np0005555140 python3.9[51993]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 04:33:38 np0005555140 python3.9[52151]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 04:33:39 np0005555140 python3.9[52311]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:33:40 np0005555140 python3.9[52395]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:33:42 np0005555140 python3.9[52557]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:33:55 np0005555140 kernel: SELinux:  Converting 2732 SID table entries...
Dec 11 04:33:55 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:33:55 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:33:55 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:33:55 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:33:55 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:33:55 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:33:55 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:33:55 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 11 04:33:55 np0005555140 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 11 04:33:57 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:33:57 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:33:57 np0005555140 systemd[1]: Reloading.
Dec 11 04:33:57 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:33:57 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:33:57 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:33:58 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:33:58 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:33:58 np0005555140 systemd[1]: run-r0329e92eacbd41e58d5eaa0839655100.service: Deactivated successfully.
Dec 11 04:33:59 np0005555140 python3.9[53656]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:33:59 np0005555140 systemd[1]: Reloading.
Dec 11 04:33:59 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:33:59 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:33:59 np0005555140 systemd[1]: Starting Open vSwitch Database Unit...
Dec 11 04:33:59 np0005555140 chown[53698]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 11 04:33:59 np0005555140 ovs-ctl[53703]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 11 04:33:59 np0005555140 ovs-ctl[53703]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 11 04:33:59 np0005555140 ovs-ctl[53703]: Starting ovsdb-server [  OK  ]
Dec 11 04:34:00 np0005555140 ovs-vsctl[53752]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 11 04:34:00 np0005555140 ovs-vsctl[53772]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2f07ba53-a431-4669-9e8c-dcf2fed72095\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 11 04:34:00 np0005555140 ovs-ctl[53703]: Configuring Open vSwitch system IDs [  OK  ]
Dec 11 04:34:00 np0005555140 ovs-ctl[53703]: Enabling remote OVSDB managers [  OK  ]
Dec 11 04:34:00 np0005555140 systemd[1]: Started Open vSwitch Database Unit.
Dec 11 04:34:00 np0005555140 ovs-vsctl[53778]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 11 04:34:00 np0005555140 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 11 04:34:00 np0005555140 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 11 04:34:00 np0005555140 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 11 04:34:00 np0005555140 kernel: openvswitch: Open vSwitch switching datapath
Dec 11 04:34:00 np0005555140 ovs-ctl[53823]: Inserting openvswitch module [  OK  ]
Dec 11 04:34:00 np0005555140 ovs-ctl[53791]: Starting ovs-vswitchd [  OK  ]
Dec 11 04:34:00 np0005555140 ovs-vsctl[53843]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 11 04:34:00 np0005555140 ovs-ctl[53791]: Enabling remote OVSDB managers [  OK  ]
Dec 11 04:34:00 np0005555140 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 11 04:34:00 np0005555140 systemd[1]: Starting Open vSwitch...
Dec 11 04:34:00 np0005555140 systemd[1]: Finished Open vSwitch.
Dec 11 04:34:01 np0005555140 python3.9[53995]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:34:02 np0005555140 python3.9[54147]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 11 04:34:03 np0005555140 kernel: SELinux:  Converting 2746 SID table entries...
Dec 11 04:34:03 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:34:03 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:34:03 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:34:03 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:34:03 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:34:03 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:34:03 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:34:04 np0005555140 python3.9[54302]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:34:05 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 11 04:34:05 np0005555140 python3.9[54460]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:34:07 np0005555140 python3.9[54613]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:34:09 np0005555140 python3.9[54900]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 11 04:34:09 np0005555140 python3.9[55050]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:34:10 np0005555140 python3.9[55204]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:34:12 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:34:12 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:34:12 np0005555140 systemd[1]: Reloading.
Dec 11 04:34:12 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:34:12 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:34:12 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:34:13 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:34:13 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:34:13 np0005555140 systemd[1]: run-r9851bf83f1d7464f932a10383e0c972b.service: Deactivated successfully.
Dec 11 04:34:13 np0005555140 python3.9[55521]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:34:13 np0005555140 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 11 04:34:13 np0005555140 systemd[1]: Stopped Network Manager Wait Online.
Dec 11 04:34:13 np0005555140 systemd[1]: Stopping Network Manager Wait Online...
Dec 11 04:34:13 np0005555140 systemd[1]: Stopping Network Manager...
Dec 11 04:34:13 np0005555140 NetworkManager[7201]: <info>  [1765445653.9847] caught SIGTERM, shutting down normally.
Dec 11 04:34:13 np0005555140 NetworkManager[7201]: <info>  [1765445653.9866] dhcp4 (eth0): canceled DHCP transaction
Dec 11 04:34:13 np0005555140 NetworkManager[7201]: <info>  [1765445653.9866] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:34:13 np0005555140 NetworkManager[7201]: <info>  [1765445653.9866] dhcp4 (eth0): state changed no lease
Dec 11 04:34:13 np0005555140 NetworkManager[7201]: <info>  [1765445653.9869] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 04:34:13 np0005555140 NetworkManager[7201]: <info>  [1765445653.9931] exiting (success)
Dec 11 04:34:14 np0005555140 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 04:34:14 np0005555140 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 04:34:14 np0005555140 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 11 04:34:14 np0005555140 systemd[1]: Stopped Network Manager.
Dec 11 04:34:14 np0005555140 systemd[1]: NetworkManager.service: Consumed 11.718s CPU time, 4.1M memory peak, read 0B from disk, written 35.5K to disk.
Dec 11 04:34:14 np0005555140 systemd[1]: Starting Network Manager...
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.0657] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:44e8beb3-23e7-4e74-aa29-1a4573300217)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.0661] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.0716] manager[0x55cd8b0e8000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 11 04:34:14 np0005555140 systemd[1]: Starting Hostname Service...
Dec 11 04:34:14 np0005555140 systemd[1]: Started Hostname Service.
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1582] hostname: hostname: using hostnamed
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1583] hostname: static hostname changed from (none) to "compute-0"
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1587] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1592] manager[0x55cd8b0e8000]: rfkill: Wi-Fi hardware radio set enabled
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1592] manager[0x55cd8b0e8000]: rfkill: WWAN hardware radio set enabled
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1611] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-ovs.so)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1619] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1619] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1620] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1621] manager: Networking is enabled by state file
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1622] settings: Loaded settings plugin: keyfile (internal)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1626] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1649] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1657] dhcp: init: Using DHCP client 'internal'
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1659] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1664] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1667] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1673] device (lo): Activation: starting connection 'lo' (d8ac9655-2a62-4b5a-8019-334c588c1842)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1678] device (eth0): carrier: link connected
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1682] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1685] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1685] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1690] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1694] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1698] device (eth1): carrier: link connected
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1701] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1704] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f0b1e71c-23c8-5785-97b9-a0c12a4c5b4b) (indicated)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1705] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1708] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1712] device (eth1): Activation: starting connection 'ci-private-network' (f0b1e71c-23c8-5785-97b9-a0c12a4c5b4b)
Dec 11 04:34:14 np0005555140 systemd[1]: Started Network Manager.
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1717] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1724] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1726] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1728] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1729] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1731] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1733] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1734] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1736] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1741] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.1743] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2039] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2050] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2055] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2057] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2061] device (lo): Activation: successful, device activated.
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2089] dhcp4 (eth0): state changed new lease, address=38.102.83.70
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2097] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2177] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2184] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2196] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2203] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2208] device (eth1): Activation: successful, device activated.
Dec 11 04:34:14 np0005555140 systemd[1]: Starting Network Manager Wait Online...
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2226] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2231] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2237] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2240] device (eth0): Activation: successful, device activated.
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2245] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 11 04:34:14 np0005555140 NetworkManager[55531]: <info>  [1765445654.2270] manager: startup complete
Dec 11 04:34:14 np0005555140 systemd[1]: Finished Network Manager Wait Online.
Dec 11 04:34:14 np0005555140 python3.9[55747]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:34:21 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:34:21 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:34:21 np0005555140 systemd[1]: Reloading.
Dec 11 04:34:21 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:34:21 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:34:21 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:34:22 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:34:22 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:34:22 np0005555140 systemd[1]: run-rcd52b8f0fb8d408f830b93740a52f3fc.service: Deactivated successfully.
Dec 11 04:34:23 np0005555140 python3.9[56208]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:34:24 np0005555140 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 04:34:24 np0005555140 python3.9[56360]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:25 np0005555140 python3.9[56514]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:25 np0005555140 python3.9[56666]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:26 np0005555140 python3.9[56818]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:27 np0005555140 python3.9[56970]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:28 np0005555140 python3.9[57122]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:34:28 np0005555140 python3.9[57245]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445667.605671-229-241482320082213/.source _original_basename=.tz2nyet4 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:29 np0005555140 python3.9[57397]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:30 np0005555140 python3.9[57549]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 11 04:34:31 np0005555140 python3.9[57701]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:33 np0005555140 python3.9[58128]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 11 04:34:34 np0005555140 ansible-async_wrapper.py[58303]: Invoked with j318376499540 300 /home/zuul/.ansible/tmp/ansible-tmp-1765445673.472142-295-76768781808696/AnsiballZ_edpm_os_net_config.py _
Dec 11 04:34:34 np0005555140 ansible-async_wrapper.py[58306]: Starting module and watcher
Dec 11 04:34:34 np0005555140 ansible-async_wrapper.py[58306]: Start watching 58307 (300)
Dec 11 04:34:34 np0005555140 ansible-async_wrapper.py[58307]: Start module (58307)
Dec 11 04:34:34 np0005555140 ansible-async_wrapper.py[58303]: Return async_wrapper task started.
Dec 11 04:34:34 np0005555140 python3.9[58308]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 11 04:34:35 np0005555140 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 11 04:34:35 np0005555140 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 11 04:34:35 np0005555140 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 11 04:34:35 np0005555140 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 11 04:34:35 np0005555140 kernel: cfg80211: failed to load regulatory.db
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.0991] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1010] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1605] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1607] audit: op="connection-add" uuid="29f4ba94-072e-4a21-872a-03f9b0d953f8" name="br-ex-br" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1630] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1632] audit: op="connection-add" uuid="b01056fb-05f0-4851-8640-619d9d4d574c" name="br-ex-port" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1652] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1655] audit: op="connection-add" uuid="1b231dce-0e85-4abb-a937-4acb398a5fbf" name="eth1-port" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1675] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1677] audit: op="connection-add" uuid="970b2b9e-b94d-467c-875c-d9dec16d35e4" name="vlan20-port" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1695] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1697] audit: op="connection-add" uuid="6fe66a4c-493a-49e3-a6c7-d8275dad3237" name="vlan21-port" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1716] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1719] audit: op="connection-add" uuid="6e7ec721-00bd-4bf2-a975-026be531c726" name="vlan22-port" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1754] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1783] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1786] audit: op="connection-add" uuid="48b55c05-dc7e-403e-a0ca-461e6c9e71e4" name="br-ex-if" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1847] audit: op="connection-update" uuid="f0b1e71c-23c8-5785-97b9-a0c12a4c5b4b" name="ci-private-network" args="ipv6.method,ipv6.routes,ipv6.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.addresses,ipv4.method,ipv4.routing-rules,ipv4.never-default,ipv4.dns,ipv4.addresses,ipv4.routes,ovs-external-ids.data,connection.master,connection.slave-type,connection.timestamp,connection.controller,connection.port-type,ovs-interface.type" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1874] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1877] audit: op="connection-add" uuid="22c04421-5ad6-46e1-8603-dab71f05b610" name="vlan20-if" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1903] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1906] audit: op="connection-add" uuid="b5b6fe74-9f5f-4552-ba9d-a492abfe2f51" name="vlan21-if" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1934] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1936] audit: op="connection-add" uuid="1bcfbfa9-d795-42f2-87f2-d69ea571cad2" name="vlan22-if" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1952] audit: op="connection-delete" uuid="bf1dd5df-db4b-356a-83a4-08676bcdd19e" name="Wired connection 1" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1969] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.1972] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1980] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1985] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (29f4ba94-072e-4a21-872a-03f9b0d953f8)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1986] audit: op="connection-activate" uuid="29f4ba94-072e-4a21-872a-03f9b0d953f8" name="br-ex-br" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1988] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.1989] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1995] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.1999] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b01056fb-05f0-4851-8640-619d9d4d574c)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2001] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2003] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2008] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2013] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1b231dce-0e85-4abb-a937-4acb398a5fbf)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2015] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2016] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2021] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2026] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (970b2b9e-b94d-467c-875c-d9dec16d35e4)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2027] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2029] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2034] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2038] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (6fe66a4c-493a-49e3-a6c7-d8275dad3237)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2040] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2041] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2047] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2052] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (6e7ec721-00bd-4bf2-a975-026be531c726)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2053] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2055] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2057] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2064] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2065] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2069] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2073] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (48b55c05-dc7e-403e-a0ca-461e6c9e71e4)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2074] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2077] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2079] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2080] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2081] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2092] device (eth1): disconnecting for new activation request.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2093] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2095] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2096] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2097] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2100] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2102] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2106] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2111] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (22c04421-5ad6-46e1-8603-dab71f05b610)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2112] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2116] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2118] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2120] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2124] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2125] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2129] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2134] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (b5b6fe74-9f5f-4552-ba9d-a492abfe2f51)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2135] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2139] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2142] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2143] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2147] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <warn>  [1765445676.2149] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2152] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2157] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (1bcfbfa9-d795-42f2-87f2-d69ea571cad2)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2158] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2162] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2164] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2166] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2169] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2184] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2187] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2191] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2194] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2202] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2209] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2214] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2218] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 kernel: ovs-system: entered promiscuous mode
Dec 11 04:34:36 np0005555140 kernel: Timeout policy base is empty
Dec 11 04:34:36 np0005555140 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2239] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2244] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 systemd-udevd[58313]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2247] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2253] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2257] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2262] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2266] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2272] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2274] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2278] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2282] dhcp4 (eth0): canceled DHCP transaction
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2282] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2282] dhcp4 (eth0): state changed no lease
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2283] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2294] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2298] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58309 uid=0 result="fail" reason="Device is not activated"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2305] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2349] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2354] dhcp4 (eth0): state changed new lease, address=38.102.83.70
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2362] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2418] device (eth1): disconnecting for new activation request.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2419] audit: op="connection-activate" uuid="f0b1e71c-23c8-5785-97b9-a0c12a4c5b4b" name="ci-private-network" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2419] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2528] device (eth1): Activation: starting connection 'ci-private-network' (f0b1e71c-23c8-5785-97b9-a0c12a4c5b4b)
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2533] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2555] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2560] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2568] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2575] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2582] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2585] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2586] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2589] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2591] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2593] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58309 uid=0 result="success"
Dec 11 04:34:36 np0005555140 kernel: br-ex: entered promiscuous mode
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2614] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2624] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2627] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2631] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2635] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2639] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2642] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2646] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2650] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2653] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2660] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2666] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2670] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2717] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2720] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2725] device (eth1): Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2736] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2746] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2777] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2779] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2783] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 kernel: vlan22: entered promiscuous mode
Dec 11 04:34:36 np0005555140 kernel: vlan20: entered promiscuous mode
Dec 11 04:34:36 np0005555140 systemd-udevd[58315]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2914] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2929] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2947] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2950] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.2953] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 kernel: vlan21: entered promiscuous mode
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3011] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3023] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3046] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3048] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3053] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3090] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3105] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3131] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3133] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 04:34:36 np0005555140 NetworkManager[55531]: <info>  [1765445676.3137] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 04:34:37 np0005555140 NetworkManager[55531]: <info>  [1765445677.4196] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58309 uid=0 result="success"
Dec 11 04:34:37 np0005555140 NetworkManager[55531]: <info>  [1765445677.5647] checkpoint[0x55cd8b0bd950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 11 04:34:37 np0005555140 NetworkManager[55531]: <info>  [1765445677.5650] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58309 uid=0 result="success"
Dec 11 04:34:37 np0005555140 NetworkManager[55531]: <info>  [1765445677.8212] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58309 uid=0 result="success"
Dec 11 04:34:37 np0005555140 NetworkManager[55531]: <info>  [1765445677.8226] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58309 uid=0 result="success"
Dec 11 04:34:38 np0005555140 NetworkManager[55531]: <info>  [1765445678.0252] audit: op="networking-control" arg="global-dns-configuration" pid=58309 uid=0 result="success"
Dec 11 04:34:38 np0005555140 NetworkManager[55531]: <info>  [1765445678.0288] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 11 04:34:38 np0005555140 NetworkManager[55531]: <info>  [1765445678.0323] audit: op="networking-control" arg="global-dns-configuration" pid=58309 uid=0 result="success"
Dec 11 04:34:38 np0005555140 NetworkManager[55531]: <info>  [1765445678.0346] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58309 uid=0 result="success"
Dec 11 04:34:38 np0005555140 python3.9[58643]: ansible-ansible.legacy.async_status Invoked with jid=j318376499540.58303 mode=status _async_dir=/root/.ansible_async
Dec 11 04:34:38 np0005555140 NetworkManager[55531]: <info>  [1765445678.1653] checkpoint[0x55cd8b0bda20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 11 04:34:38 np0005555140 NetworkManager[55531]: <info>  [1765445678.1657] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58309 uid=0 result="success"
Dec 11 04:34:38 np0005555140 ansible-async_wrapper.py[58307]: Module complete (58307)
Dec 11 04:34:39 np0005555140 ansible-async_wrapper.py[58306]: Done in kid B.
Dec 11 04:34:41 np0005555140 python3.9[58747]: ansible-ansible.legacy.async_status Invoked with jid=j318376499540.58303 mode=status _async_dir=/root/.ansible_async
Dec 11 04:34:41 np0005555140 python3.9[58847]: ansible-ansible.legacy.async_status Invoked with jid=j318376499540.58303 mode=cleanup _async_dir=/root/.ansible_async
Dec 11 04:34:42 np0005555140 python3.9[58999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:34:43 np0005555140 python3.9[59122]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445682.1792865-322-92152325566254/.source.returncode _original_basename=.2lfguoz0 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:43 np0005555140 python3.9[59274]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:34:44 np0005555140 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 04:34:44 np0005555140 python3.9[59399]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445683.4112737-338-123936167554113/.source.cfg _original_basename=.7tap8dru follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:34:45 np0005555140 python3.9[59552]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:34:45 np0005555140 systemd[1]: Reloading Network Manager...
Dec 11 04:34:45 np0005555140 NetworkManager[55531]: <info>  [1765445685.2266] audit: op="reload" arg="0" pid=59556 uid=0 result="success"
Dec 11 04:34:45 np0005555140 NetworkManager[55531]: <info>  [1765445685.2272] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 11 04:34:45 np0005555140 systemd[1]: Reloaded Network Manager.
Dec 11 04:34:45 np0005555140 systemd[1]: session-11.scope: Deactivated successfully.
Dec 11 04:34:45 np0005555140 systemd[1]: session-11.scope: Consumed 50.002s CPU time.
Dec 11 04:34:45 np0005555140 systemd-logind[787]: Session 11 logged out. Waiting for processes to exit.
Dec 11 04:34:45 np0005555140 systemd-logind[787]: Removed session 11.
Dec 11 04:34:52 np0005555140 systemd-logind[787]: New session 12 of user zuul.
Dec 11 04:34:52 np0005555140 systemd[1]: Started Session 12 of User zuul.
Dec 11 04:34:53 np0005555140 python3.9[59740]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:34:53 np0005555140 python3.9[59894]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:34:55 np0005555140 python3.9[60084]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:34:55 np0005555140 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 04:34:55 np0005555140 systemd[1]: session-12.scope: Deactivated successfully.
Dec 11 04:34:55 np0005555140 systemd[1]: session-12.scope: Consumed 2.251s CPU time.
Dec 11 04:34:55 np0005555140 systemd-logind[787]: Session 12 logged out. Waiting for processes to exit.
Dec 11 04:34:55 np0005555140 systemd-logind[787]: Removed session 12.
Dec 11 04:35:00 np0005555140 systemd-logind[787]: New session 13 of user zuul.
Dec 11 04:35:00 np0005555140 systemd[1]: Started Session 13 of User zuul.
Dec 11 04:35:01 np0005555140 python3.9[60266]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:35:02 np0005555140 python3.9[60420]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:35:03 np0005555140 python3.9[60576]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:35:04 np0005555140 python3.9[60661]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:35:06 np0005555140 python3.9[60814]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:35:07 np0005555140 python3.9[61006]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:08 np0005555140 python3.9[61158]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:35:08 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:35:09 np0005555140 python3.9[61321]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:09 np0005555140 python3.9[61399]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:10 np0005555140 python3.9[61551]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:10 np0005555140 python3.9[61629]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:11 np0005555140 python3.9[61781]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:12 np0005555140 python3.9[61933]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:12 np0005555140 python3.9[62085]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:13 np0005555140 python3.9[62237]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:14 np0005555140 python3.9[62389]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:35:16 np0005555140 python3.9[62542]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:35:17 np0005555140 python3.9[62696]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:35:18 np0005555140 python3.9[62848]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:35:18 np0005555140 python3.9[63000]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:35:19 np0005555140 python3.9[63153]: ansible-service_facts Invoked
Dec 11 04:35:19 np0005555140 network[63170]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 04:35:19 np0005555140 network[63171]: 'network-scripts' will be removed from distribution in near future.
Dec 11 04:35:19 np0005555140 network[63172]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 04:35:24 np0005555140 python3.9[63624]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:35:26 np0005555140 python3.9[63777]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 11 04:35:27 np0005555140 python3.9[63929]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:28 np0005555140 python3.9[64054]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445727.2142348-232-146000462632606/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:29 np0005555140 python3.9[64208]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:29 np0005555140 python3.9[64333]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445728.728502-247-132167499694033/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:30 np0005555140 python3.9[64487]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:31 np0005555140 python3.9[64641]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:35:32 np0005555140 python3.9[64725]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:35:33 np0005555140 python3.9[64879]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:35:34 np0005555140 python3.9[64963]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:35:34 np0005555140 chronyd[795]: chronyd exiting
Dec 11 04:35:34 np0005555140 systemd[1]: Stopping NTP client/server...
Dec 11 04:35:34 np0005555140 systemd[1]: chronyd.service: Deactivated successfully.
Dec 11 04:35:34 np0005555140 systemd[1]: Stopped NTP client/server.
Dec 11 04:35:34 np0005555140 systemd[1]: Starting NTP client/server...
Dec 11 04:35:34 np0005555140 chronyd[64972]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 11 04:35:34 np0005555140 chronyd[64972]: Frequency -28.558 +/- 0.342 ppm read from /var/lib/chrony/drift
Dec 11 04:35:34 np0005555140 chronyd[64972]: Loaded seccomp filter (level 2)
Dec 11 04:35:34 np0005555140 systemd[1]: Started NTP client/server.
Dec 11 04:35:35 np0005555140 systemd[1]: session-13.scope: Deactivated successfully.
Dec 11 04:35:35 np0005555140 systemd[1]: session-13.scope: Consumed 25.664s CPU time.
Dec 11 04:35:35 np0005555140 systemd-logind[787]: Session 13 logged out. Waiting for processes to exit.
Dec 11 04:35:35 np0005555140 systemd-logind[787]: Removed session 13.
Dec 11 04:35:40 np0005555140 systemd-logind[787]: New session 14 of user zuul.
Dec 11 04:35:40 np0005555140 systemd[1]: Started Session 14 of User zuul.
Dec 11 04:35:41 np0005555140 python3.9[65151]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:35:42 np0005555140 python3.9[65307]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:43 np0005555140 python3.9[65482]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:44 np0005555140 python3.9[65560]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.ua8s9att recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:44 np0005555140 python3.9[65712]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:45 np0005555140 python3.9[65835]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445744.5388262-61-174887212347736/.source _original_basename=.8bilp6rl follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:46 np0005555140 python3.9[65987]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:47 np0005555140 python3.9[66139]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:47 np0005555140 python3.9[66262]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445746.5548308-85-148696118243407/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:48 np0005555140 python3.9[66414]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:48 np0005555140 python3.9[66537]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445747.69221-85-105239824177728/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:35:49 np0005555140 python3.9[66689]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:50 np0005555140 python3.9[66841]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:50 np0005555140 python3.9[66964]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445749.7576363-122-159558369510121/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:51 np0005555140 python3.9[67116]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:52 np0005555140 python3.9[67239]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445750.99829-137-201757276836095/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:53 np0005555140 python3.9[67392]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:35:53 np0005555140 systemd[1]: Reloading.
Dec 11 04:35:53 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:35:53 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:35:53 np0005555140 systemd[1]: Reloading.
Dec 11 04:35:53 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:35:53 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:35:53 np0005555140 systemd[1]: Starting EDPM Container Shutdown...
Dec 11 04:35:53 np0005555140 systemd[1]: Finished EDPM Container Shutdown.
Dec 11 04:35:54 np0005555140 python3.9[67621]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:54 np0005555140 python3.9[67744]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445753.955564-160-7736520561149/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:55 np0005555140 python3.9[67896]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:35:56 np0005555140 python3.9[68019]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445755.1080928-175-80336153427947/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:35:56 np0005555140 python3.9[68171]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:35:56 np0005555140 systemd[1]: Reloading.
Dec 11 04:35:57 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:35:57 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:35:57 np0005555140 systemd[1]: Reloading.
Dec 11 04:35:57 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:35:57 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:35:57 np0005555140 systemd[1]: Starting Create netns directory...
Dec 11 04:35:57 np0005555140 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 04:35:57 np0005555140 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 04:35:57 np0005555140 systemd[1]: Finished Create netns directory.
Dec 11 04:35:58 np0005555140 python3.9[68397]: ansible-ansible.builtin.service_facts Invoked
Dec 11 04:35:58 np0005555140 network[68414]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 04:35:58 np0005555140 network[68415]: 'network-scripts' will be removed from distribution in near future.
Dec 11 04:35:58 np0005555140 network[68416]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 04:36:01 np0005555140 python3.9[68678]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:36:02 np0005555140 systemd[1]: Reloading.
Dec 11 04:36:02 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:36:02 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:36:02 np0005555140 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 11 04:36:02 np0005555140 iptables.init[68719]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 11 04:36:02 np0005555140 iptables.init[68719]: iptables: Flushing firewall rules: [  OK  ]
Dec 11 04:36:02 np0005555140 systemd[1]: iptables.service: Deactivated successfully.
Dec 11 04:36:02 np0005555140 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 11 04:36:03 np0005555140 python3.9[68915]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:36:04 np0005555140 python3.9[69069]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:36:04 np0005555140 systemd[1]: Reloading.
Dec 11 04:36:04 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:36:04 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:36:04 np0005555140 systemd[1]: Starting Netfilter Tables...
Dec 11 04:36:04 np0005555140 systemd[1]: Finished Netfilter Tables.
Dec 11 04:36:05 np0005555140 python3.9[69260]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:36:06 np0005555140 python3.9[69413]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:06 np0005555140 python3.9[69538]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445765.683376-244-90629748909853/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:07 np0005555140 python3.9[69691]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:36:07 np0005555140 systemd[1]: Reloading OpenSSH server daemon...
Dec 11 04:36:07 np0005555140 systemd[1]: Reloaded OpenSSH server daemon.
Dec 11 04:36:08 np0005555140 python3.9[69847]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:08 np0005555140 python3.9[69999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:09 np0005555140 python3.9[70122]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445768.3089564-275-121144033739311/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:10 np0005555140 python3.9[70274]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 11 04:36:10 np0005555140 systemd[1]: Starting Time & Date Service...
Dec 11 04:36:10 np0005555140 systemd[1]: Started Time & Date Service.
Dec 11 04:36:11 np0005555140 python3.9[70430]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:11 np0005555140 python3.9[70582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:12 np0005555140 python3.9[70705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445771.2057502-310-115196235289494/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:12 np0005555140 python3.9[70857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:13 np0005555140 python3.9[70980]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445772.346161-325-65104819487323/.source.yaml _original_basename=.k5ce2ti9 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:13 np0005555140 python3.9[71132]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:14 np0005555140 python3.9[71255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445773.4811044-340-183992841244464/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:15 np0005555140 python3.9[71407]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:36:15 np0005555140 python3.9[71560]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:36:16 np0005555140 python3[71713]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 04:36:17 np0005555140 python3.9[71865]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:18 np0005555140 python3.9[71988]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445776.9733114-379-114101084972424/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:18 np0005555140 python3.9[72140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:19 np0005555140 python3.9[72263]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445778.21287-394-46782783086031/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:19 np0005555140 python3.9[72415]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:20 np0005555140 python3.9[72538]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445779.3890061-409-253432589231927/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:21 np0005555140 python3.9[72690]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:21 np0005555140 python3.9[72813]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445780.5903144-424-247023795807979/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:22 np0005555140 python3.9[72965]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:36:22 np0005555140 python3.9[73088]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445781.8414254-439-121159437251905/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:23 np0005555140 python3.9[73240]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:24 np0005555140 python3.9[73392]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:36:25 np0005555140 python3.9[73551]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:25 np0005555140 python3.9[73704]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:26 np0005555140 python3.9[73856]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:27 np0005555140 python3.9[74008]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 11 04:36:27 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:36:27 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:36:28 np0005555140 python3.9[74162]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 11 04:36:28 np0005555140 systemd[1]: session-14.scope: Deactivated successfully.
Dec 11 04:36:28 np0005555140 systemd[1]: session-14.scope: Consumed 35.728s CPU time.
Dec 11 04:36:28 np0005555140 systemd-logind[787]: Session 14 logged out. Waiting for processes to exit.
Dec 11 04:36:28 np0005555140 systemd-logind[787]: Removed session 14.
Dec 11 04:36:33 np0005555140 systemd-logind[787]: New session 15 of user zuul.
Dec 11 04:36:33 np0005555140 systemd[1]: Started Session 15 of User zuul.
Dec 11 04:36:34 np0005555140 python3.9[74343]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 11 04:36:35 np0005555140 python3.9[74495]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:36:36 np0005555140 python3.9[74647]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:36:37 np0005555140 python3.9[74799]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkWXJ8xuxA4h0DREHZ1ogD2WWBAWWm773/9KJwBgPIMjMOYDMXK4GSLKkuMir1J3W1YPk7K5k6pxKKvQFkB7O2ki3/IWGQhAvU7Cq2d9StYUtY4YaruuHphAdrWg1+g7xyrUswgBucjVUVklu+x5NIEIJul4QkwItVjCWgbKAaOhWEtEjiHrFvu+G/neEZxT/Ku5WEXde3EoPzM8xrybnImWjfWzitm84Rc1cbDtR4BbT6fRnMV1mj5EUfSftDVgTlTHsot88vjMABSYaG96//zaXz9b3RVAzFJyZ7me7D0EDKbsJBEndLlmmAKbRG625F8iqHaynJQrNxBMjjyS8prccJ2OPaTbA3+gH6aVj6r//APZP4A+suVx3MaCioq36Nao/M5gOE2UjFAbdxWm/Xy1RPnH4DqdbR2uAXhQrUW1231+01eB8GiPJncQgji5ka7IBtz2COYcebyirNdq23Yw54ajb6RwqasjCDzSdLYC3q2MwnLaf+17BSDxGAckE=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINe8jneqr/DuuN5pj6FX2f4XZs1J4xEIsj28xIb2bWzE#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKShJIo+G8hn8NKrEg4XJjRAppKLRldYAl8LY/gX1E/j9g/l+cFFGhwZtIUkZae2d3QRq0JTZYfKntK5yZd+MOE=#012 create=True mode=0644 path=/tmp/ansible.qaoagksq state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:38 np0005555140 python3.9[74951]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.qaoagksq' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:36:38 np0005555140 python3.9[75105]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.qaoagksq state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:39 np0005555140 systemd[1]: session-15.scope: Deactivated successfully.
Dec 11 04:36:39 np0005555140 systemd[1]: session-15.scope: Consumed 3.569s CPU time.
Dec 11 04:36:39 np0005555140 systemd-logind[787]: Session 15 logged out. Waiting for processes to exit.
Dec 11 04:36:39 np0005555140 systemd-logind[787]: Removed session 15.
Dec 11 04:36:40 np0005555140 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 11 04:36:44 np0005555140 systemd-logind[787]: New session 16 of user zuul.
Dec 11 04:36:44 np0005555140 systemd[1]: Started Session 16 of User zuul.
Dec 11 04:36:45 np0005555140 python3.9[75285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:36:46 np0005555140 python3.9[75441]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 11 04:36:47 np0005555140 python3.9[75595]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:36:48 np0005555140 python3.9[75748]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:36:49 np0005555140 python3.9[75901]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:36:50 np0005555140 python3.9[76055]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:36:52 np0005555140 python3.9[76210]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:36:52 np0005555140 systemd[1]: session-16.scope: Deactivated successfully.
Dec 11 04:36:52 np0005555140 systemd[1]: session-16.scope: Consumed 4.476s CPU time.
Dec 11 04:36:52 np0005555140 systemd-logind[787]: Session 16 logged out. Waiting for processes to exit.
Dec 11 04:36:52 np0005555140 systemd-logind[787]: Removed session 16.
Dec 11 04:36:58 np0005555140 systemd-logind[787]: New session 17 of user zuul.
Dec 11 04:36:58 np0005555140 systemd[1]: Started Session 17 of User zuul.
Dec 11 04:36:59 np0005555140 python3.9[76388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:37:00 np0005555140 python3.9[76544]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:37:01 np0005555140 python3.9[76628]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 04:37:03 np0005555140 python3.9[76779]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:37:05 np0005555140 python3.9[76930]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 04:37:05 np0005555140 python3.9[77080]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:37:06 np0005555140 python3.9[77230]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:37:06 np0005555140 systemd[1]: session-17.scope: Deactivated successfully.
Dec 11 04:37:06 np0005555140 systemd[1]: session-17.scope: Consumed 6.045s CPU time.
Dec 11 04:37:06 np0005555140 systemd-logind[787]: Session 17 logged out. Waiting for processes to exit.
Dec 11 04:37:06 np0005555140 systemd-logind[787]: Removed session 17.
Dec 11 04:37:12 np0005555140 systemd-logind[787]: New session 18 of user zuul.
Dec 11 04:37:12 np0005555140 systemd[1]: Started Session 18 of User zuul.
Dec 11 04:37:13 np0005555140 python3.9[77408]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:37:14 np0005555140 python3.9[77564]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:15 np0005555140 python3.9[77716]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:16 np0005555140 python3.9[77868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:17 np0005555140 python3.9[77991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445835.7567773-65-37641783636310/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bee8309068f601a4d3a03dc388eed1909700a101 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:17 np0005555140 python3.9[78143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:18 np0005555140 python3.9[78266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445837.425306-65-185495989030738/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=7043162ced2f904660321981d55b292f171eb2e1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:19 np0005555140 python3.9[78418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:19 np0005555140 python3.9[78541]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445838.6928704-65-244923086036420/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=51decb7741bd251928612b97fac29d364b87107f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:20 np0005555140 python3.9[78693]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:21 np0005555140 python3.9[78845]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:21 np0005555140 python3.9[78997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:22 np0005555140 python3.9[79120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445841.3826478-124-176212524412359/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=1dff163a49d8b440b71f83fbfbcced70210ddae6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:23 np0005555140 python3.9[79272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:23 np0005555140 python3.9[79395]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445842.5593374-124-202573718920254/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f02dddad1778ef20fd44dfd02785a63876f71d4d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:24 np0005555140 python3.9[79547]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:24 np0005555140 python3.9[79670]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445843.7492924-124-73266729459166/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b3a2929ed9c20bdc7696fd104bb9452dfc8f66cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:25 np0005555140 python3.9[79822]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:26 np0005555140 python3.9[79974]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:26 np0005555140 python3.9[80126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:27 np0005555140 python3.9[80249]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445846.4170747-183-38374183317870/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=9f34ff48f25b551a9d1515778f095fbca43a7b71 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:27 np0005555140 python3.9[80401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:28 np0005555140 python3.9[80524]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445847.5419464-183-86974243862971/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=989bbc91156f266924c19b8e7e20ca8395592834 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:29 np0005555140 python3.9[80676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:29 np0005555140 python3.9[80799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445848.6131396-183-55042873918470/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=ba0840f0fccebae892ef7a4e245ca67cdc2e71f9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:30 np0005555140 python3.9[80951]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:31 np0005555140 python3.9[81103]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:31 np0005555140 python3.9[81255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:32 np0005555140 python3.9[81378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445851.3334596-242-138333291545362/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=14da928b2792a10431dff1b919869ce2f2f319f3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:32 np0005555140 python3.9[81530]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:33 np0005555140 python3.9[81653]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445852.479335-242-64198514286647/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=989bbc91156f266924c19b8e7e20ca8395592834 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:34 np0005555140 python3.9[81805]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:34 np0005555140 python3.9[81928]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445853.6461382-242-92864867457807/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a0bf4a97ea8d5042cb494e2ed6eed618628e49b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:35 np0005555140 python3.9[82080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:36 np0005555140 python3.9[82232]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:37 np0005555140 python3.9[82355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445856.0202904-310-136463515012421/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:37 np0005555140 python3.9[82507]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:38 np0005555140 python3.9[82659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:39 np0005555140 python3.9[82782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445858.0634713-334-41287525179926/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:39 np0005555140 python3.9[82934]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:40 np0005555140 python3.9[83086]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:41 np0005555140 python3.9[83209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445860.173017-358-269937093591099/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:41 np0005555140 python3.9[83361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:42 np0005555140 python3.9[83513]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:43 np0005555140 python3.9[83636]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445862.024388-382-212655302082941/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:43 np0005555140 python3.9[83788]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:44 np0005555140 python3.9[83940]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:45 np0005555140 chronyd[64972]: Selected source 167.160.187.12 (pool.ntp.org)
Dec 11 04:37:45 np0005555140 python3.9[84063]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445864.244299-406-166313640962714/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:46 np0005555140 python3.9[84215]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:46 np0005555140 python3.9[84367]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:47 np0005555140 python3.9[84490]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445866.1917973-430-140042474083552/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:47 np0005555140 python3.9[84642]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:48 np0005555140 python3.9[84794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:37:49 np0005555140 python3.9[84917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445868.0688772-454-7990208497480/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=76f2f55ba4f8b77eaf270f62b3cc490fa1fec564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:37:49 np0005555140 systemd[1]: session-18.scope: Deactivated successfully.
Dec 11 04:37:49 np0005555140 systemd[1]: session-18.scope: Consumed 29.397s CPU time.
Dec 11 04:37:49 np0005555140 systemd-logind[787]: Session 18 logged out. Waiting for processes to exit.
Dec 11 04:37:49 np0005555140 systemd-logind[787]: Removed session 18.
Dec 11 04:37:55 np0005555140 systemd-logind[787]: New session 19 of user zuul.
Dec 11 04:37:55 np0005555140 systemd[1]: Started Session 19 of User zuul.
Dec 11 04:37:56 np0005555140 python3.9[85096]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:37:57 np0005555140 python3.9[85252]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:58 np0005555140 python3.9[85404]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:37:59 np0005555140 python3.9[85554]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:38:00 np0005555140 python3.9[85706]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 11 04:38:01 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 11 04:38:02 np0005555140 python3.9[85862]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:38:02 np0005555140 python3.9[85946]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:38:05 np0005555140 python3.9[86099]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:38:06 np0005555140 python3[86254]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 11 04:38:06 np0005555140 python3.9[86406]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:07 np0005555140 python3.9[86558]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:08 np0005555140 python3.9[86636]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:08 np0005555140 python3.9[86788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:09 np0005555140 python3.9[86866]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.x_ft6lnm recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:09 np0005555140 python3.9[87018]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:10 np0005555140 python3.9[87096]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:11 np0005555140 python3.9[87248]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:12 np0005555140 python3[87401]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 04:38:12 np0005555140 python3.9[87553]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:13 np0005555140 python3.9[87678]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445892.2272518-157-18545325395609/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:14 np0005555140 python3.9[87830]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:14 np0005555140 python3.9[87955]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445893.632538-172-260506593627804/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:15 np0005555140 python3.9[88107]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:16 np0005555140 python3.9[88232]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445895.0024683-187-279274447777429/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:16 np0005555140 python3.9[88384]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:17 np0005555140 python3.9[88509]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445896.227735-202-245226919343415/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:17 np0005555140 python3.9[88661]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:18 np0005555140 python3.9[88786]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765445897.461869-217-231034446251032/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:19 np0005555140 python3.9[88938]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:19 np0005555140 python3.9[89090]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:20 np0005555140 python3.9[89245]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:21 np0005555140 python3.9[89397]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:22 np0005555140 python3.9[89550]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:38:22 np0005555140 python3.9[89704]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:23 np0005555140 python3.9[89859]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:24 np0005555140 python3.9[90009]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:38:25 np0005555140 python3.9[90162]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:cb:58:d7:dd" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:25 np0005555140 ovs-vsctl[90163]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:cb:58:d7:dd external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 11 04:38:26 np0005555140 python3.9[90315]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:26 np0005555140 python3.9[90470]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:26 np0005555140 ovs-vsctl[90471]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 11 04:38:27 np0005555140 python3.9[90621]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:38:28 np0005555140 python3.9[90775]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:38:28 np0005555140 python3.9[90927]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:29 np0005555140 python3.9[91005]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:38:29 np0005555140 python3.9[91157]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:30 np0005555140 python3.9[91235]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:38:30 np0005555140 python3.9[91387]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:31 np0005555140 python3.9[91539]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:31 np0005555140 python3.9[91617]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:32 np0005555140 python3.9[91769]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:32 np0005555140 python3.9[91847]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:33 np0005555140 python3.9[91999]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:38:33 np0005555140 systemd[1]: Reloading.
Dec 11 04:38:33 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:38:33 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:38:34 np0005555140 python3.9[92187]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:34 np0005555140 python3.9[92265]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:35 np0005555140 python3.9[92417]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:35 np0005555140 python3.9[92495]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:36 np0005555140 python3.9[92647]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:38:36 np0005555140 systemd[1]: Reloading.
Dec 11 04:38:36 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:38:36 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:38:37 np0005555140 systemd[1]: Starting Create netns directory...
Dec 11 04:38:37 np0005555140 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 04:38:37 np0005555140 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 04:38:37 np0005555140 systemd[1]: Finished Create netns directory.
Dec 11 04:38:38 np0005555140 python3.9[92841]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:38:39 np0005555140 python3.9[92993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:40 np0005555140 python3.9[93116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445918.9549236-468-243985564652065/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:38:40 np0005555140 python3.9[93268]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:38:41 np0005555140 python3.9[93420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:38:42 np0005555140 python3.9[93543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445921.0580502-493-33850289018125/.source.json _original_basename=.8lcsuxun follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:42 np0005555140 python3.9[93695]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:45 np0005555140 python3.9[94122]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 11 04:38:45 np0005555140 python3.9[94274]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:38:46 np0005555140 python3.9[94426]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 11 04:38:46 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:38:47 np0005555140 python3[94590]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:38:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:38:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:38:48 np0005555140 podman[94625]: 2025-12-11 09:38:48.178203064 +0000 UTC m=+0.054386774 container create 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec 11 04:38:48 np0005555140 podman[94625]: 2025-12-11 09:38:48.1507585 +0000 UTC m=+0.026942250 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Dec 11 04:38:48 np0005555140 python3[94590]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Dec 11 04:38:48 np0005555140 python3.9[94813]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:38:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 04:38:49 np0005555140 python3.9[94967]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:50 np0005555140 python3.9[95043]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:38:50 np0005555140 python3.9[95194]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765445930.099379-581-135649633854758/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:38:52 np0005555140 python3.9[95270]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:38:52 np0005555140 systemd[1]: Reloading.
Dec 11 04:38:52 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:38:52 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:38:53 np0005555140 python3.9[95380]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:38:53 np0005555140 systemd[1]: Reloading.
Dec 11 04:38:53 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:38:53 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:38:53 np0005555140 systemd[1]: Starting ovn_controller container...
Dec 11 04:38:53 np0005555140 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 11 04:38:53 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:38:53 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc0fc91ef68376e71802527bd8958c23a3daa0ee4d2496edc1e93ec9dbb1f0f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 11 04:38:53 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659.
Dec 11 04:38:53 np0005555140 podman[95421]: 2025-12-11 09:38:53.643561806 +0000 UTC m=+0.152214542 container init 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 11 04:38:53 np0005555140 ovn_controller[95438]: + sudo -E kolla_set_configs
Dec 11 04:38:53 np0005555140 podman[95421]: 2025-12-11 09:38:53.669291477 +0000 UTC m=+0.177944203 container start 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Dec 11 04:38:53 np0005555140 edpm-start-podman-container[95421]: ovn_controller
Dec 11 04:38:53 np0005555140 systemd[1]: Created slice User Slice of UID 0.
Dec 11 04:38:53 np0005555140 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 11 04:38:53 np0005555140 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 11 04:38:53 np0005555140 systemd[1]: Starting User Manager for UID 0...
Dec 11 04:38:53 np0005555140 edpm-start-podman-container[95420]: Creating additional drop-in dependency for "ovn_controller" (7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659)
Dec 11 04:38:53 np0005555140 podman[95445]: 2025-12-11 09:38:53.736998353 +0000 UTC m=+0.057713930 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 04:38:53 np0005555140 systemd[1]: 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659-1452b9648c26f6d1.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 04:38:53 np0005555140 systemd[1]: 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659-1452b9648c26f6d1.service: Failed with result 'exit-code'.
Dec 11 04:38:53 np0005555140 systemd[1]: Reloading.
Dec 11 04:38:53 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:38:53 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:38:53 np0005555140 systemd[95467]: Queued start job for default target Main User Target.
Dec 11 04:38:53 np0005555140 systemd[95467]: Created slice User Application Slice.
Dec 11 04:38:53 np0005555140 systemd[95467]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 11 04:38:53 np0005555140 systemd[95467]: Started Daily Cleanup of User's Temporary Directories.
Dec 11 04:38:53 np0005555140 systemd[95467]: Reached target Paths.
Dec 11 04:38:53 np0005555140 systemd[95467]: Reached target Timers.
Dec 11 04:38:53 np0005555140 systemd[95467]: Starting D-Bus User Message Bus Socket...
Dec 11 04:38:53 np0005555140 systemd[95467]: Starting Create User's Volatile Files and Directories...
Dec 11 04:38:53 np0005555140 systemd[95467]: Finished Create User's Volatile Files and Directories.
Dec 11 04:38:53 np0005555140 systemd[95467]: Listening on D-Bus User Message Bus Socket.
Dec 11 04:38:53 np0005555140 systemd[95467]: Reached target Sockets.
Dec 11 04:38:53 np0005555140 systemd[95467]: Reached target Basic System.
Dec 11 04:38:53 np0005555140 systemd[95467]: Reached target Main User Target.
Dec 11 04:38:53 np0005555140 systemd[95467]: Startup finished in 138ms.
Dec 11 04:38:53 np0005555140 systemd[1]: Started User Manager for UID 0.
Dec 11 04:38:53 np0005555140 systemd[1]: Started ovn_controller container.
Dec 11 04:38:53 np0005555140 systemd[1]: Started Session c1 of User root.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: INFO:__main__:Validating config file
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: INFO:__main__:Writing out command to execute
Dec 11 04:38:54 np0005555140 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: ++ cat /run_command
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + ARGS=
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + sudo kolla_copy_cacerts
Dec 11 04:38:54 np0005555140 systemd[1]: Started Session c2 of User root.
Dec 11 04:38:54 np0005555140 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + [[ ! -n '' ]]
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + . kolla_extend_start
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + umask 0022
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1321] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1327] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <warn>  [1765445934.1330] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1335] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1340] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1343] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 11 04:38:54 np0005555140 kernel: br-int: entered promiscuous mode
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 04:38:54 np0005555140 ovn_controller[95438]: 2025-12-11T09:38:54Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1571] manager: (ovn-3746d4-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 11 04:38:54 np0005555140 systemd-udevd[95583]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:38:54 np0005555140 kernel: genev_sys_6081: entered promiscuous mode
Dec 11 04:38:54 np0005555140 systemd-udevd[95584]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1770] device (genev_sys_6081): carrier: link connected
Dec 11 04:38:54 np0005555140 NetworkManager[55531]: <info>  [1765445934.1773] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 11 04:38:54 np0005555140 python3.9[95698]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:54 np0005555140 ovs-vsctl[95699]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 11 04:38:55 np0005555140 python3.9[95851]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:55 np0005555140 ovs-vsctl[95853]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 11 04:38:56 np0005555140 python3.9[96006]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:38:56 np0005555140 ovs-vsctl[96007]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 11 04:38:56 np0005555140 systemd-logind[787]: Session 19 logged out. Waiting for processes to exit.
Dec 11 04:38:56 np0005555140 systemd[1]: session-19.scope: Deactivated successfully.
Dec 11 04:38:56 np0005555140 systemd[1]: session-19.scope: Consumed 44.455s CPU time.
Dec 11 04:38:56 np0005555140 systemd-logind[787]: Removed session 19.
Dec 11 04:39:03 np0005555140 systemd-logind[787]: New session 21 of user zuul.
Dec 11 04:39:03 np0005555140 systemd[1]: Started Session 21 of User zuul.
Dec 11 04:39:04 np0005555140 python3.9[96185]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:39:04 np0005555140 systemd[1]: Stopping User Manager for UID 0...
Dec 11 04:39:04 np0005555140 systemd[95467]: Activating special unit Exit the Session...
Dec 11 04:39:04 np0005555140 systemd[95467]: Stopped target Main User Target.
Dec 11 04:39:04 np0005555140 systemd[95467]: Stopped target Basic System.
Dec 11 04:39:04 np0005555140 systemd[95467]: Stopped target Paths.
Dec 11 04:39:04 np0005555140 systemd[95467]: Stopped target Sockets.
Dec 11 04:39:04 np0005555140 systemd[95467]: Stopped target Timers.
Dec 11 04:39:04 np0005555140 systemd[95467]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 11 04:39:04 np0005555140 systemd[95467]: Closed D-Bus User Message Bus Socket.
Dec 11 04:39:04 np0005555140 systemd[95467]: Stopped Create User's Volatile Files and Directories.
Dec 11 04:39:04 np0005555140 systemd[95467]: Removed slice User Application Slice.
Dec 11 04:39:04 np0005555140 systemd[95467]: Reached target Shutdown.
Dec 11 04:39:04 np0005555140 systemd[95467]: Finished Exit the Session.
Dec 11 04:39:04 np0005555140 systemd[95467]: Reached target Exit the Session.
Dec 11 04:39:04 np0005555140 systemd[1]: user@0.service: Deactivated successfully.
Dec 11 04:39:04 np0005555140 systemd[1]: Stopped User Manager for UID 0.
Dec 11 04:39:04 np0005555140 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 11 04:39:04 np0005555140 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 11 04:39:04 np0005555140 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 11 04:39:04 np0005555140 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 11 04:39:04 np0005555140 systemd[1]: Removed slice User Slice of UID 0.
Dec 11 04:39:05 np0005555140 python3.9[96344]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:05 np0005555140 python3.9[96496]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:06 np0005555140 python3.9[96648]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:07 np0005555140 python3.9[96800]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:07 np0005555140 python3.9[96952]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:08 np0005555140 python3.9[97102]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:39:09 np0005555140 python3.9[97254]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 11 04:39:10 np0005555140 python3.9[97404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:11 np0005555140 python3.9[97525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445950.1576507-86-145452046373713/.source follow=False _original_basename=haproxy.j2 checksum=d225e0e1c34f765c55f17e757e326dba55238d01 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:12 np0005555140 python3.9[97675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:12 np0005555140 python3.9[97797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445951.6758342-101-241861499710064/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:13 np0005555140 python3.9[97949]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:39:14 np0005555140 python3.9[98033]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:39:16 np0005555140 python3.9[98186]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:39:17 np0005555140 python3.9[98339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:17 np0005555140 python3.9[98460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445956.8790252-138-124408179797055/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:18 np0005555140 python3.9[98610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:19 np0005555140 python3.9[98731]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445957.9939222-138-130210667653289/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:20 np0005555140 python3.9[98881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:20 np0005555140 python3.9[99002]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445959.7608302-182-259168253102964/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:21 np0005555140 python3.9[99152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:21 np0005555140 python3.9[99273]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445960.7834935-182-165057817433421/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:22 np0005555140 python3.9[99423]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:39:23 np0005555140 python3.9[99577]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:23 np0005555140 python3.9[99729]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:23 np0005555140 ovn_controller[95438]: 2025-12-11T09:39:23Z|00025|memory|INFO|16256 kB peak resident set size after 29.8 seconds
Dec 11 04:39:23 np0005555140 ovn_controller[95438]: 2025-12-11T09:39:23Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec 11 04:39:23 np0005555140 podman[99779]: 2025-12-11 09:39:23.936654448 +0000 UTC m=+0.092819193 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 04:39:24 np0005555140 python3.9[99824]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:24 np0005555140 python3.9[99985]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:25 np0005555140 python3.9[100063]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:25 np0005555140 python3.9[100215]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:26 np0005555140 python3.9[100367]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:26 np0005555140 python3.9[100445]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:27 np0005555140 python3.9[100597]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:28 np0005555140 python3.9[100675]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:28 np0005555140 python3.9[100827]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:39:28 np0005555140 systemd[1]: Reloading.
Dec 11 04:39:28 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:39:28 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:39:29 np0005555140 python3.9[101016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:30 np0005555140 python3.9[101094]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:30 np0005555140 python3.9[101246]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:31 np0005555140 python3.9[101324]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:32 np0005555140 python3.9[101476]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:39:32 np0005555140 systemd[1]: Reloading.
Dec 11 04:39:32 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:39:32 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:39:32 np0005555140 systemd[1]: Starting Create netns directory...
Dec 11 04:39:32 np0005555140 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 04:39:32 np0005555140 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 04:39:32 np0005555140 systemd[1]: Finished Create netns directory.
Dec 11 04:39:33 np0005555140 python3.9[101670]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:33 np0005555140 python3.9[101822]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:34 np0005555140 python3.9[101945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765445973.2481542-333-80626799926043/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:35 np0005555140 python3.9[102097]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:39:35 np0005555140 python3.9[102249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:39:36 np0005555140 python3.9[102372]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765445975.321415-358-252342237922401/.source.json _original_basename=.d8r5g_kp follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:36 np0005555140 python3.9[102524]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:39 np0005555140 python3.9[102951]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 11 04:39:39 np0005555140 python3.9[103104]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:39:40 np0005555140 python3.9[103256]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 11 04:39:42 np0005555140 python3[103433]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:39:42 np0005555140 podman[103469]: 2025-12-11 09:39:42.282582719 +0000 UTC m=+0.049191918 container create 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 11 04:39:42 np0005555140 podman[103469]: 2025-12-11 09:39:42.253139261 +0000 UTC m=+0.019748470 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:39:42 np0005555140 python3[103433]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:39:42 np0005555140 python3.9[103658]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:39:43 np0005555140 python3.9[103812]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:44 np0005555140 python3.9[103888]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:39:44 np0005555140 python3.9[104039]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765445984.1643288-446-87065512169834/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:39:45 np0005555140 python3.9[104115]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:39:45 np0005555140 systemd[1]: Reloading.
Dec 11 04:39:45 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:39:45 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:39:46 np0005555140 python3.9[104226]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:39:46 np0005555140 systemd[1]: Reloading.
Dec 11 04:39:46 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:39:46 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:39:46 np0005555140 systemd[1]: Starting ovn_metadata_agent container...
Dec 11 04:39:46 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:39:46 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29cf94b953c5fddc8ec86419b90b8e982fd9c967355cf52a0596f88ab23df077/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 11 04:39:46 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29cf94b953c5fddc8ec86419b90b8e982fd9c967355cf52a0596f88ab23df077/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:39:46 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483.
Dec 11 04:39:46 np0005555140 podman[104268]: 2025-12-11 09:39:46.646498206 +0000 UTC m=+0.118004811 container init 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + sudo -E kolla_set_configs
Dec 11 04:39:46 np0005555140 podman[104268]: 2025-12-11 09:39:46.677269336 +0000 UTC m=+0.148775951 container start 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 04:39:46 np0005555140 edpm-start-podman-container[104268]: ovn_metadata_agent
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Validating config file
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Copying service configuration files
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Writing out command to execute
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: ++ cat /run_command
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + CMD=neutron-ovn-metadata-agent
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + ARGS=
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + sudo kolla_copy_cacerts
Dec 11 04:39:46 np0005555140 edpm-start-podman-container[104267]: Creating additional drop-in dependency for "ovn_metadata_agent" (9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483)
Dec 11 04:39:46 np0005555140 podman[104290]: 2025-12-11 09:39:46.746693253 +0000 UTC m=+0.058588631 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + [[ ! -n '' ]]
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + . kolla_extend_start
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: Running command: 'neutron-ovn-metadata-agent'
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + umask 0022
Dec 11 04:39:46 np0005555140 ovn_metadata_agent[104282]: + exec neutron-ovn-metadata-agent
Dec 11 04:39:46 np0005555140 systemd[1]: Reloading.
Dec 11 04:39:46 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:39:46 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:39:46 np0005555140 systemd[1]: Started ovn_metadata_agent container.
Dec 11 04:39:47 np0005555140 systemd[1]: session-21.scope: Deactivated successfully.
Dec 11 04:39:47 np0005555140 systemd[1]: session-21.scope: Consumed 33.620s CPU time.
Dec 11 04:39:47 np0005555140 systemd-logind[787]: Session 21 logged out. Waiting for processes to exit.
Dec 11 04:39:47 np0005555140 systemd-logind[787]: Removed session 21.
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.554 104288 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.555 104288 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.555 104288 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.555 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.555 104288 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.555 104288 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.556 104288 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.557 104288 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.558 104288 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.559 104288 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.560 104288 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.561 104288 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.562 104288 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.562 104288 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.562 104288 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.562 104288 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.562 104288 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.562 104288 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.563 104288 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.563 104288 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.563 104288 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.563 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.563 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.563 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.564 104288 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.565 104288 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.566 104288 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.567 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.568 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.569 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.570 104288 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.571 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.572 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.573 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.574 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.575 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.576 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.577 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.578 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.579 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.580 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.581 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.582 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.583 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.584 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.585 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.586 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.587 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.587 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.587 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.587 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.587 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.587 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.587 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.588 104288 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.589 104288 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.598 104288 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.598 104288 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.599 104288 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.599 104288 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.599 104288 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.611 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2f07ba53-a431-4669-9e8c-dcf2fed72095 (UUID: 2f07ba53-a431-4669-9e8c-dcf2fed72095) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.639 104288 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.639 104288 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.639 104288 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.640 104288 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.647 104288 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.659 104288 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.664 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2f07ba53-a431-4669-9e8c-dcf2fed72095'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], external_ids={}, name=2f07ba53-a431-4669-9e8c-dcf2fed72095, nb_cfg_timestamp=1765445942157, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.665 104288 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fce7d2d80d0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.665 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.666 104288 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.666 104288 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.666 104288 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.670 104288 DEBUG oslo_service.service [-] Started child 104397 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.673 104288 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpt932775a/privsep.sock']#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.673 104397 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-493861'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.695 104397 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.695 104397 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.695 104397 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.699 104397 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.705 104397 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec 11 04:39:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:48.711 104397 INFO eventlet.wsgi.server [-] (104397) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec 11 04:39:49 np0005555140 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.346 104288 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.347 104288 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpt932775a/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.210 104402 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.214 104402 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.216 104402 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.216 104402 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104402
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.349 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[eff83e5e-5161-410f-9b8a-7a3cf0831e31]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.842 104402 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.843 104402 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:39:49 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:49.843 104402 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.382 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[53d1d4bc-a50d-4e3f-a297-d703bc9dad2d]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.385 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, column=external_ids, values=({'neutron:ovn-metadata-id': 'fe982631-43f0-500e-b723-56e7cfe951ac'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.423 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.429 104288 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.429 104288 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.429 104288 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.430 104288 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.430 104288 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.430 104288 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.430 104288 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.430 104288 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.430 104288 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.430 104288 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.431 104288 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.431 104288 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.431 104288 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.431 104288 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.431 104288 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.431 104288 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.432 104288 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.432 104288 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.432 104288 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.432 104288 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.432 104288 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.432 104288 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.432 104288 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.433 104288 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.434 104288 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.434 104288 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.434 104288 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.434 104288 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.434 104288 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.434 104288 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.435 104288 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.436 104288 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.437 104288 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.438 104288 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.439 104288 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.440 104288 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.440 104288 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.440 104288 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.440 104288 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.440 104288 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.440 104288 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.440 104288 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.441 104288 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.442 104288 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.443 104288 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.444 104288 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.445 104288 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.446 104288 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.447 104288 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.448 104288 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.449 104288 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.450 104288 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.451 104288 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.452 104288 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.453 104288 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.454 104288 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.455 104288 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.456 104288 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.457 104288 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.458 104288 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.459 104288 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.460 104288 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.461 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.462 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.463 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.464 104288 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.465 104288 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:39:50 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:39:50.465 104288 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec 11 04:39:52 np0005555140 systemd-logind[787]: New session 22 of user zuul.
Dec 11 04:39:52 np0005555140 systemd[1]: Started Session 22 of User zuul.
Dec 11 04:39:53 np0005555140 python3.9[104560]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:39:54 np0005555140 podman[104688]: 2025-12-11 09:39:54.598527917 +0000 UTC m=+0.117166348 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 04:39:55 np0005555140 python3.9[104737]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:39:56 np0005555140 python3.9[104907]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:39:56 np0005555140 systemd[1]: Reloading.
Dec 11 04:39:56 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:39:56 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:39:57 np0005555140 python3.9[105093]: ansible-ansible.builtin.service_facts Invoked
Dec 11 04:39:57 np0005555140 network[105110]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 04:39:57 np0005555140 network[105111]: 'network-scripts' will be removed from distribution in near future.
Dec 11 04:39:57 np0005555140 network[105112]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 04:40:00 np0005555140 python3.9[105373]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:40:01 np0005555140 python3.9[105526]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:40:02 np0005555140 python3.9[105679]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:40:02 np0005555140 python3.9[105832]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:40:03 np0005555140 python3.9[105985]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:40:04 np0005555140 python3.9[106138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:40:05 np0005555140 python3.9[106291]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:40:06 np0005555140 python3.9[106444]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:06 np0005555140 python3.9[106596]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:07 np0005555140 python3.9[106748]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:08 np0005555140 python3.9[106900]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:08 np0005555140 python3.9[107052]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:09 np0005555140 python3.9[107204]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:10 np0005555140 python3.9[107356]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:10 np0005555140 python3.9[107508]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:11 np0005555140 python3.9[107660]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:11 np0005555140 python3.9[107812]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:12 np0005555140 python3.9[107964]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:13 np0005555140 python3.9[108116]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:13 np0005555140 python3.9[108268]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:14 np0005555140 python3.9[108420]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:40:14 np0005555140 python3.9[108572]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:15 np0005555140 python3.9[108724]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 04:40:16 np0005555140 python3.9[108876]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:40:16 np0005555140 systemd[1]: Reloading.
Dec 11 04:40:16 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:40:16 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:40:17 np0005555140 podman[108911]: 2025-12-11 09:40:17.024603879 +0000 UTC m=+0.098325240 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 04:40:17 np0005555140 python3.9[109082]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:18 np0005555140 python3.9[109235]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:18 np0005555140 python3.9[109388]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:19 np0005555140 python3.9[109541]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:20 np0005555140 python3.9[109694]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:20 np0005555140 python3.9[109847]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:21 np0005555140 python3.9[110000]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:40:22 np0005555140 python3.9[110153]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 11 04:40:23 np0005555140 python3.9[110306]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 04:40:24 np0005555140 python3.9[110464]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 04:40:24 np0005555140 podman[110596]: 2025-12-11 09:40:24.896035861 +0000 UTC m=+0.081769408 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 04:40:25 np0005555140 python3.9[110642]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:40:26 np0005555140 python3.9[110734]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:40:47 np0005555140 podman[110919]: 2025-12-11 09:40:47.701708088 +0000 UTC m=+0.065563465 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:40:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:40:48.601 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:40:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:40:48.602 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:40:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:40:48.602 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:40:55 np0005555140 podman[110945]: 2025-12-11 09:40:55.704987983 +0000 UTC m=+0.081644753 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec 11 04:41:02 np0005555140 kernel: SELinux:  Converting 2759 SID table entries...
Dec 11 04:41:02 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:41:02 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:41:02 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:41:02 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:41:02 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:41:02 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:41:02 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:41:12 np0005555140 kernel: SELinux:  Converting 2759 SID table entries...
Dec 11 04:41:12 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:41:12 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:41:12 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:41:12 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:41:12 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:41:12 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:41:12 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:41:18 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 11 04:41:18 np0005555140 podman[110987]: 2025-12-11 09:41:18.719877248 +0000 UTC m=+0.088151654 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 04:41:26 np0005555140 podman[111930]: 2025-12-11 09:41:26.705294639 +0000 UTC m=+0.083571358 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 11 04:41:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:41:48.602 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:41:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:41:48.602 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:41:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:41:48.602 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:41:49 np0005555140 podman[126893]: 2025-12-11 09:41:49.684304075 +0000 UTC m=+0.049465803 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 11 04:41:57 np0005555140 podman[127858]: 2025-12-11 09:41:57.732941619 +0000 UTC m=+0.110382556 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:42:04 np0005555140 kernel: SELinux:  Converting 2760 SID table entries...
Dec 11 04:42:04 np0005555140 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 04:42:04 np0005555140 kernel: SELinux:  policy capability open_perms=1
Dec 11 04:42:04 np0005555140 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 04:42:04 np0005555140 kernel: SELinux:  policy capability always_check_network=0
Dec 11 04:42:04 np0005555140 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 04:42:04 np0005555140 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 04:42:04 np0005555140 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 04:42:05 np0005555140 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Dec 11 04:42:05 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 11 04:42:05 np0005555140 dbus-broker-launch[756]: Noticed file-system modification, trigger reload.
Dec 11 04:42:12 np0005555140 systemd[1]: Stopping OpenSSH server daemon...
Dec 11 04:42:12 np0005555140 systemd[1]: sshd.service: Deactivated successfully.
Dec 11 04:42:12 np0005555140 systemd[1]: Stopped OpenSSH server daemon.
Dec 11 04:42:12 np0005555140 systemd[1]: sshd.service: Consumed 1.238s CPU time, read 32.0K from disk, written 0B to disk.
Dec 11 04:42:12 np0005555140 systemd[1]: Stopped target sshd-keygen.target.
Dec 11 04:42:12 np0005555140 systemd[1]: Stopping sshd-keygen.target...
Dec 11 04:42:12 np0005555140 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 04:42:12 np0005555140 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 04:42:12 np0005555140 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 04:42:12 np0005555140 systemd[1]: Reached target sshd-keygen.target.
Dec 11 04:42:12 np0005555140 systemd[1]: Starting OpenSSH server daemon...
Dec 11 04:42:12 np0005555140 systemd[1]: Started OpenSSH server daemon.
Dec 11 04:42:14 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:42:14 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:42:14 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:14 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:14 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:14 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:42:18 np0005555140 python3.9[133596]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:42:18 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:18 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:18 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:19 np0005555140 python3.9[134869]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:42:19 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:19 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:19 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:20 np0005555140 podman[136102]: 2025-12-11 09:42:20.496757114 +0000 UTC m=+0.055815273 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:42:20 np0005555140 python3.9[136243]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:42:20 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:20 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:20 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:21 np0005555140 python3.9[137567]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:42:21 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:21 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:21 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:22 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:42:22 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:42:22 np0005555140 systemd[1]: man-db-cache-update.service: Consumed 9.700s CPU time.
Dec 11 04:42:22 np0005555140 systemd[1]: run-r3c76d6c4efcd44d59855012d8ba82107.service: Deactivated successfully.
Dec 11 04:42:22 np0005555140 python3.9[138247]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:23 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:23 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:23 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:23 np0005555140 python3.9[138437]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:24 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:24 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:24 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:25 np0005555140 python3.9[138627]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:25 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:25 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:25 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:25 np0005555140 python3.9[138818]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:26 np0005555140 python3.9[138973]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:26 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:26 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:26 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:27 np0005555140 python3.9[139163]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 04:42:28 np0005555140 systemd[1]: Reloading.
Dec 11 04:42:28 np0005555140 podman[139165]: 2025-12-11 09:42:28.066729763 +0000 UTC m=+0.118677259 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 04:42:28 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:42:28 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:42:28 np0005555140 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 11 04:42:28 np0005555140 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 11 04:42:29 np0005555140 python3.9[139380]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:29 np0005555140 python3.9[139535]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:30 np0005555140 python3.9[139690]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:31 np0005555140 python3.9[139845]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:32 np0005555140 python3.9[140000]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:32 np0005555140 python3.9[140155]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:33 np0005555140 python3.9[140310]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:34 np0005555140 python3.9[140465]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:35 np0005555140 python3.9[140620]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:35 np0005555140 python3.9[140775]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:36 np0005555140 python3.9[140930]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:37 np0005555140 python3.9[141085]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:38 np0005555140 python3.9[141240]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:39 np0005555140 python3.9[141395]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 04:42:39 np0005555140 python3.9[141550]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:42:40 np0005555140 python3.9[141702]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:42:41 np0005555140 python3.9[141854]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:42:41 np0005555140 python3.9[142006]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:42:42 np0005555140 python3.9[142158]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:42:43 np0005555140 python3.9[142310]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:42:43 np0005555140 python3.9[142462]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:44 np0005555140 python3.9[142587]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446163.2963245-554-117338705154709/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:45 np0005555140 python3.9[142739]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:46 np0005555140 python3.9[142864]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446164.9027374-554-228483360896153/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:46 np0005555140 python3.9[143016]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:47 np0005555140 python3.9[143141]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446166.1745307-554-23690936890873/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:48 np0005555140 python3.9[143293]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:42:48.604 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:42:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:42:48.604 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:42:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:42:48.605 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:42:48 np0005555140 python3.9[143418]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446167.5041692-554-2571578278358/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:49 np0005555140 python3.9[143570]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:49 np0005555140 python3.9[143695]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446168.8071637-554-193617343402158/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:50 np0005555140 python3.9[143847]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:50 np0005555140 podman[143850]: 2025-12-11 09:42:50.696975804 +0000 UTC m=+0.061508789 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 11 04:42:51 np0005555140 python3.9[143993]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446169.9878137-554-96051644588243/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:51 np0005555140 python3.9[144145]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:52 np0005555140 python3.9[144268]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446171.258784-554-249682414741226/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:52 np0005555140 python3.9[144420]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:42:53 np0005555140 python3.9[144545]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765446172.3468282-554-118843476165507/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:53 np0005555140 python3.9[144697]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 11 04:42:54 np0005555140 python3.9[144850]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:55 np0005555140 python3.9[145002]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:55 np0005555140 python3.9[145154]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:56 np0005555140 python3.9[145306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:57 np0005555140 python3.9[145458]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:57 np0005555140 python3.9[145610]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:58 np0005555140 python3.9[145762]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:58 np0005555140 podman[145817]: 2025-12-11 09:42:58.784580441 +0000 UTC m=+0.141332330 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:42:59 np0005555140 python3.9[145938]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:42:59 np0005555140 python3.9[146090]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:00 np0005555140 python3.9[146242]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:00 np0005555140 python3.9[146394]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:01 np0005555140 python3.9[146546]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:02 np0005555140 python3.9[146698]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:02 np0005555140 python3.9[146850]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:03 np0005555140 python3.9[147002]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:04 np0005555140 python3.9[147125]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446183.046524-775-201802011170594/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:04 np0005555140 python3.9[147277]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:05 np0005555140 python3.9[147400]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446184.2446783-775-266631922704501/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:05 np0005555140 python3.9[147552]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:06 np0005555140 python3.9[147675]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446185.5020497-775-13775479167883/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:07 np0005555140 python3.9[147827]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:07 np0005555140 python3.9[147950]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446186.658908-775-219549780217227/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:08 np0005555140 python3.9[148102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:09 np0005555140 python3.9[148225]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446187.8927398-775-264385891466477/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:09 np0005555140 python3.9[148377]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:10 np0005555140 python3.9[148500]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446189.1966383-775-250164401192681/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:10 np0005555140 python3.9[148652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:11 np0005555140 python3.9[148775]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446190.2713013-775-65394881163684/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:11 np0005555140 python3.9[148927]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:12 np0005555140 python3.9[149050]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446191.3870466-775-36241673004046/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:13 np0005555140 python3.9[149202]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:13 np0005555140 python3.9[149325]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446192.5536919-775-269036069389749/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:14 np0005555140 python3.9[149477]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:14 np0005555140 python3.9[149600]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446193.6711075-775-25991132497383/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:15 np0005555140 python3.9[149752]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:15 np0005555140 python3.9[149875]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446194.8653138-775-62091829386126/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:16 np0005555140 python3.9[150027]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:17 np0005555140 python3.9[150150]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446196.036414-775-228417412518344/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:17 np0005555140 python3.9[150302]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:18 np0005555140 python3.9[150425]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446197.264876-775-173168451914818/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:18 np0005555140 python3.9[150577]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:19 np0005555140 python3.9[150700]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446198.4461029-775-256505185368448/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:20 np0005555140 python3.9[150850]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:43:20 np0005555140 python3.9[151005]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 11 04:43:21 np0005555140 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 11 04:43:21 np0005555140 podman[151007]: 2025-12-11 09:43:21.681784636 +0000 UTC m=+0.055477483 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 04:43:22 np0005555140 python3.9[151181]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:23 np0005555140 python3.9[151333]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:23 np0005555140 python3.9[151485]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:24 np0005555140 python3.9[151637]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:25 np0005555140 python3.9[151789]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:25 np0005555140 python3.9[151941]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:26 np0005555140 python3.9[152093]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:27 np0005555140 python3.9[152245]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:27 np0005555140 python3.9[152397]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:28 np0005555140 python3.9[152549]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:29 np0005555140 podman[152673]: 2025-12-11 09:43:29.073943865 +0000 UTC m=+0.117356087 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 11 04:43:29 np0005555140 python3.9[152721]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:43:29 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:29 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:29 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:43:29 np0005555140 systemd[1]: Starting libvirt logging daemon socket...
Dec 11 04:43:29 np0005555140 systemd[1]: Listening on libvirt logging daemon socket.
Dec 11 04:43:29 np0005555140 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 11 04:43:29 np0005555140 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 11 04:43:29 np0005555140 systemd[1]: Starting libvirt logging daemon...
Dec 11 04:43:29 np0005555140 systemd[1]: Started libvirt logging daemon.
Dec 11 04:43:30 np0005555140 python3.9[152920]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:43:30 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:30 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:30 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:43:30 np0005555140 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 11 04:43:30 np0005555140 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 11 04:43:30 np0005555140 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 11 04:43:30 np0005555140 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 11 04:43:30 np0005555140 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 11 04:43:30 np0005555140 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 11 04:43:30 np0005555140 systemd[1]: Starting libvirt nodedev daemon...
Dec 11 04:43:30 np0005555140 systemd[1]: Started libvirt nodedev daemon.
Dec 11 04:43:31 np0005555140 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 11 04:43:31 np0005555140 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 11 04:43:31 np0005555140 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 11 04:43:31 np0005555140 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 11 04:43:31 np0005555140 python3.9[153137]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:43:31 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:31 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:43:31 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:31 np0005555140 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 11 04:43:31 np0005555140 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 11 04:43:31 np0005555140 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 11 04:43:31 np0005555140 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 11 04:43:31 np0005555140 systemd[1]: Starting libvirt proxy daemon...
Dec 11 04:43:32 np0005555140 systemd[1]: Started libvirt proxy daemon.
Dec 11 04:43:32 np0005555140 setroubleshoot[153012]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 273555f4-a8a9-4323-a8cf-9907bce981f8
Dec 11 04:43:32 np0005555140 setroubleshoot[153012]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec 11 04:43:32 np0005555140 python3.9[153357]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:43:32 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:32 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:32 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:43:33 np0005555140 systemd[1]: Listening on libvirt locking daemon socket.
Dec 11 04:43:33 np0005555140 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 11 04:43:33 np0005555140 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 11 04:43:33 np0005555140 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 11 04:43:33 np0005555140 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 11 04:43:33 np0005555140 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 11 04:43:33 np0005555140 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 11 04:43:33 np0005555140 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 11 04:43:33 np0005555140 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 11 04:43:33 np0005555140 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 11 04:43:33 np0005555140 systemd[1]: Starting libvirt QEMU daemon...
Dec 11 04:43:33 np0005555140 systemd[1]: Started libvirt QEMU daemon.
Dec 11 04:43:33 np0005555140 python3.9[153573]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:43:33 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:34 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:34 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:43:34 np0005555140 systemd[1]: Starting libvirt secret daemon socket...
Dec 11 04:43:34 np0005555140 systemd[1]: Listening on libvirt secret daemon socket.
Dec 11 04:43:34 np0005555140 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 11 04:43:34 np0005555140 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 11 04:43:34 np0005555140 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 11 04:43:34 np0005555140 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 11 04:43:34 np0005555140 systemd[1]: Starting libvirt secret daemon...
Dec 11 04:43:34 np0005555140 systemd[1]: Started libvirt secret daemon.
Dec 11 04:43:35 np0005555140 python3.9[153784]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:35 np0005555140 python3.9[153936]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 04:43:36 np0005555140 python3.9[154088]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:37 np0005555140 python3.9[154211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446216.1250658-1120-2190128850666/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:37 np0005555140 python3.9[154363]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:38 np0005555140 python3.9[154515]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:38 np0005555140 python3.9[154593]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:39 np0005555140 python3.9[154745]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:40 np0005555140 python3.9[154823]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gwc_9ni_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:40 np0005555140 python3.9[154975]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:41 np0005555140 python3.9[155053]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:42 np0005555140 python3.9[155205]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:43:42 np0005555140 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 11 04:43:42 np0005555140 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.016s CPU time.
Dec 11 04:43:42 np0005555140 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 11 04:43:42 np0005555140 python3[155358]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 04:43:43 np0005555140 python3.9[155510]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:43 np0005555140 python3.9[155588]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:44 np0005555140 python3.9[155740]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:45 np0005555140 python3.9[155818]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:45 np0005555140 python3.9[155970]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:46 np0005555140 python3.9[156048]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:46 np0005555140 python3.9[156200]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:47 np0005555140 python3.9[156278]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:47 np0005555140 python3.9[156430]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:48 np0005555140 python3.9[156555]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446227.4159281-1245-148527793573458/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:43:48.606 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:43:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:43:48.607 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:43:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:43:48.607 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:43:49 np0005555140 python3.9[156707]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:49 np0005555140 python3.9[156859]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:43:50 np0005555140 python3.9[157014]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:51 np0005555140 python3.9[157166]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:43:52 np0005555140 podman[157291]: 2025-12-11 09:43:52.063165062 +0000 UTC m=+0.061955837 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 04:43:52 np0005555140 python3.9[157330]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:43:52 np0005555140 python3.9[157492]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:43:53 np0005555140 python3.9[157647]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:54 np0005555140 python3.9[157799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:54 np0005555140 python3.9[157922]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446233.8186405-1317-105046293173739/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:55 np0005555140 python3.9[158074]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:55 np0005555140 python3.9[158197]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446234.9814758-1332-48680379565021/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:56 np0005555140 python3.9[158349]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:43:57 np0005555140 python3.9[158472]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446236.1297421-1347-24421631014771/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:43:57 np0005555140 python3.9[158624]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:43:57 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:58 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:43:58 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:58 np0005555140 systemd[1]: Reached target edpm_libvirt.target.
Dec 11 04:43:59 np0005555140 python3.9[158816]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 11 04:43:59 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:59 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:43:59 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:59 np0005555140 podman[158818]: 2025-12-11 09:43:59.241753288 +0000 UTC m=+0.098004838 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 04:43:59 np0005555140 systemd[1]: Reloading.
Dec 11 04:43:59 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:43:59 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:44:00 np0005555140 systemd[1]: session-22.scope: Deactivated successfully.
Dec 11 04:44:00 np0005555140 systemd[1]: session-22.scope: Consumed 3min 18.873s CPU time.
Dec 11 04:44:00 np0005555140 systemd-logind[787]: Session 22 logged out. Waiting for processes to exit.
Dec 11 04:44:00 np0005555140 systemd-logind[787]: Removed session 22.
Dec 11 04:44:05 np0005555140 systemd-logind[787]: New session 23 of user zuul.
Dec 11 04:44:05 np0005555140 systemd[1]: Started Session 23 of User zuul.
Dec 11 04:44:06 np0005555140 python3.9[159092]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:44:08 np0005555140 python3.9[159246]: ansible-ansible.builtin.service_facts Invoked
Dec 11 04:44:08 np0005555140 network[159263]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 04:44:08 np0005555140 network[159264]: 'network-scripts' will be removed from distribution in near future.
Dec 11 04:44:08 np0005555140 network[159265]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 04:44:12 np0005555140 python3.9[159536]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 04:44:13 np0005555140 python3.9[159620]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:44:20 np0005555140 python3.9[159773]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:44:21 np0005555140 python3.9[159925]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:44:22 np0005555140 python3.9[160078]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:44:22 np0005555140 podman[160202]: 2025-12-11 09:44:22.577638325 +0000 UTC m=+0.060574016 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 11 04:44:22 np0005555140 python3.9[160247]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:44:23 np0005555140 python3.9[160402]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:24 np0005555140 python3.9[160525]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446262.932686-95-56946366606058/.source.iscsi _original_basename=.r0tje7v4 follow=False checksum=947d91dd05e1253c1a896324f0b6adab1f6f7295 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:25 np0005555140 python3.9[160677]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:25 np0005555140 python3.9[160829]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:25 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:44:25 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:44:27 np0005555140 python3.9[160982]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:44:28 np0005555140 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 11 04:44:28 np0005555140 python3.9[161139]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:44:28 np0005555140 systemd[1]: Reloading.
Dec 11 04:44:29 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:44:29 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:44:29 np0005555140 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 11 04:44:29 np0005555140 systemd[1]: Starting Open-iSCSI...
Dec 11 04:44:29 np0005555140 kernel: Loading iSCSI transport class v2.0-870.
Dec 11 04:44:29 np0005555140 systemd[1]: Started Open-iSCSI.
Dec 11 04:44:29 np0005555140 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 11 04:44:29 np0005555140 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 11 04:44:29 np0005555140 podman[161214]: 2025-12-11 09:44:29.7378305 +0000 UTC m=+0.103566626 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 04:44:30 np0005555140 python3.9[161368]: ansible-ansible.builtin.service_facts Invoked
Dec 11 04:44:30 np0005555140 network[161385]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 04:44:30 np0005555140 network[161386]: 'network-scripts' will be removed from distribution in near future.
Dec 11 04:44:30 np0005555140 network[161387]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 04:44:33 np0005555140 python3.9[161658]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 11 04:44:34 np0005555140 python3.9[161810]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 11 04:44:35 np0005555140 python3.9[161966]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:35 np0005555140 python3.9[162089]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446274.918219-172-112573334247385/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:36 np0005555140 python3.9[162241]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:37 np0005555140 python3.9[162393]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:44:37 np0005555140 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 11 04:44:37 np0005555140 systemd[1]: Stopped Load Kernel Modules.
Dec 11 04:44:37 np0005555140 systemd[1]: Stopping Load Kernel Modules...
Dec 11 04:44:37 np0005555140 systemd[1]: Starting Load Kernel Modules...
Dec 11 04:44:37 np0005555140 systemd[1]: Finished Load Kernel Modules.
Dec 11 04:44:38 np0005555140 python3.9[162549]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:44:39 np0005555140 python3.9[162701]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:44:39 np0005555140 python3.9[162853]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:44:40 np0005555140 python3.9[163005]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:40 np0005555140 python3.9[163128]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446279.9546416-230-113398575716207/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:41 np0005555140 python3.9[163280]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:44:42 np0005555140 python3.9[163433]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:43 np0005555140 python3.9[163585]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:43 np0005555140 python3.9[163737]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:44 np0005555140 python3.9[163889]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:44 np0005555140 python3.9[164041]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:45 np0005555140 python3.9[164193]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:46 np0005555140 python3.9[164345]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:46 np0005555140 python3.9[164497]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:44:47 np0005555140 python3.9[164651]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:48 np0005555140 python3.9[164803]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:44:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:44:48.608 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:44:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:44:48.609 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:44:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:44:48.609 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:44:48 np0005555140 python3.9[164955]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:49 np0005555140 python3.9[165033]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:44:49 np0005555140 python3.9[165185]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:50 np0005555140 python3.9[165263]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:44:50 np0005555140 python3.9[165415]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:51 np0005555140 python3.9[165567]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:51 np0005555140 python3.9[165645]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:52 np0005555140 python3.9[165797]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:52 np0005555140 podman[165798]: 2025-12-11 09:44:52.697644406 +0000 UTC m=+0.066209375 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:44:53 np0005555140 python3.9[165895]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:53 np0005555140 python3.9[166047]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:44:53 np0005555140 systemd[1]: Reloading.
Dec 11 04:44:53 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:44:53 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:44:54 np0005555140 python3.9[166235]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:55 np0005555140 python3.9[166313]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:55 np0005555140 python3.9[166465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:56 np0005555140 python3.9[166543]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:44:57 np0005555140 python3.9[166695]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:44:57 np0005555140 systemd[1]: Reloading.
Dec 11 04:44:57 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:44:57 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:44:57 np0005555140 systemd[1]: Starting Create netns directory...
Dec 11 04:44:57 np0005555140 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 04:44:57 np0005555140 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 04:44:57 np0005555140 systemd[1]: Finished Create netns directory.
Dec 11 04:44:58 np0005555140 python3.9[166888]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:44:58 np0005555140 python3.9[167040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:44:59 np0005555140 python3.9[167163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446298.5430863-437-76675376548153/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:45:00 np0005555140 podman[167287]: 2025-12-11 09:45:00.278304898 +0000 UTC m=+0.088905904 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 04:45:00 np0005555140 python3.9[167332]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:45:01 np0005555140 python3.9[167494]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:45:01 np0005555140 python3.9[167617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446300.8215435-462-206954415121918/.source.json _original_basename=.tsl4equw follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:02 np0005555140 python3.9[167769]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:04 np0005555140 python3.9[168196]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 11 04:45:05 np0005555140 python3.9[168348]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:45:06 np0005555140 python3.9[168500]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 11 04:45:07 np0005555140 python3[168679]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:45:07 np0005555140 podman[168713]: 2025-12-11 09:45:07.965113778 +0000 UTC m=+0.048974920 container create 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 04:45:07 np0005555140 podman[168713]: 2025-12-11 09:45:07.935807333 +0000 UTC m=+0.019668465 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f
Dec 11 04:45:07 np0005555140 python3[168679]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f
Dec 11 04:45:08 np0005555140 python3.9[168904]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:45:09 np0005555140 python3.9[169058]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:10 np0005555140 python3.9[169134]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:45:10 np0005555140 python3.9[169285]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765446310.1532235-550-263059488852865/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:11 np0005555140 python3.9[169361]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:45:11 np0005555140 systemd[1]: Reloading.
Dec 11 04:45:11 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:45:11 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:45:12 np0005555140 python3.9[169472]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:12 np0005555140 systemd[1]: Reloading.
Dec 11 04:45:12 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:45:12 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:45:12 np0005555140 systemd[1]: Starting multipathd container...
Dec 11 04:45:12 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:45:12 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/812a525c46919cd11bd59facb1e52cd2c23aff390fc8d6bc89ef7499cfd939f9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 04:45:12 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/812a525c46919cd11bd59facb1e52cd2c23aff390fc8d6bc89ef7499cfd939f9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 04:45:12 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.
Dec 11 04:45:12 np0005555140 podman[169512]: 2025-12-11 09:45:12.591111286 +0000 UTC m=+0.112868058 container init 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:45:12 np0005555140 multipathd[169527]: + sudo -E kolla_set_configs
Dec 11 04:45:12 np0005555140 podman[169512]: 2025-12-11 09:45:12.618166648 +0000 UTC m=+0.139923410 container start 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 11 04:45:12 np0005555140 podman[169512]: multipathd
Dec 11 04:45:12 np0005555140 systemd[1]: Started multipathd container.
Dec 11 04:45:12 np0005555140 multipathd[169527]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:45:12 np0005555140 multipathd[169527]: INFO:__main__:Validating config file
Dec 11 04:45:12 np0005555140 multipathd[169527]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:45:12 np0005555140 multipathd[169527]: INFO:__main__:Writing out command to execute
Dec 11 04:45:12 np0005555140 podman[169534]: 2025-12-11 09:45:12.700115274 +0000 UTC m=+0.069466656 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 11 04:45:12 np0005555140 multipathd[169527]: ++ cat /run_command
Dec 11 04:45:12 np0005555140 systemd[1]: 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248-33337ad4c1080987.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 04:45:12 np0005555140 systemd[1]: 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248-33337ad4c1080987.service: Failed with result 'exit-code'.
Dec 11 04:45:12 np0005555140 multipathd[169527]: + CMD='/usr/sbin/multipathd -d'
Dec 11 04:45:12 np0005555140 multipathd[169527]: + ARGS=
Dec 11 04:45:12 np0005555140 multipathd[169527]: + sudo kolla_copy_cacerts
Dec 11 04:45:12 np0005555140 multipathd[169527]: + [[ ! -n '' ]]
Dec 11 04:45:12 np0005555140 multipathd[169527]: + . kolla_extend_start
Dec 11 04:45:12 np0005555140 multipathd[169527]: Running command: '/usr/sbin/multipathd -d'
Dec 11 04:45:12 np0005555140 multipathd[169527]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 11 04:45:12 np0005555140 multipathd[169527]: + umask 0022
Dec 11 04:45:12 np0005555140 multipathd[169527]: + exec /usr/sbin/multipathd -d
Dec 11 04:45:12 np0005555140 multipathd[169527]: 2671.370428 | --------start up--------
Dec 11 04:45:12 np0005555140 multipathd[169527]: 2671.370450 | read /etc/multipath.conf
Dec 11 04:45:12 np0005555140 multipathd[169527]: 2671.376709 | path checkers start up
Dec 11 04:45:13 np0005555140 python3.9[169717]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:45:13 np0005555140 python3.9[169871]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:14 np0005555140 python3.9[170036]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:45:14 np0005555140 systemd[1]: Stopping multipathd container...
Dec 11 04:45:15 np0005555140 multipathd[169527]: 2673.737647 | exit (signal)
Dec 11 04:45:15 np0005555140 multipathd[169527]: 2673.737713 | --------shut down-------
Dec 11 04:45:15 np0005555140 systemd[1]: libpod-8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.scope: Deactivated successfully.
Dec 11 04:45:15 np0005555140 podman[170040]: 2025-12-11 09:45:15.154296629 +0000 UTC m=+0.343648085 container died 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 04:45:15 np0005555140 systemd[1]: 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248-33337ad4c1080987.timer: Deactivated successfully.
Dec 11 04:45:15 np0005555140 systemd[1]: Stopped /usr/bin/podman healthcheck run 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.
Dec 11 04:45:15 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248-userdata-shm.mount: Deactivated successfully.
Dec 11 04:45:15 np0005555140 systemd[1]: var-lib-containers-storage-overlay-812a525c46919cd11bd59facb1e52cd2c23aff390fc8d6bc89ef7499cfd939f9-merged.mount: Deactivated successfully.
Dec 11 04:45:15 np0005555140 podman[170040]: 2025-12-11 09:45:15.671060316 +0000 UTC m=+0.860411732 container cleanup 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 04:45:15 np0005555140 podman[170040]: multipathd
Dec 11 04:45:15 np0005555140 podman[170069]: multipathd
Dec 11 04:45:15 np0005555140 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 11 04:45:15 np0005555140 systemd[1]: Stopped multipathd container.
Dec 11 04:45:15 np0005555140 systemd[1]: Starting multipathd container...
Dec 11 04:45:16 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:45:16 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/812a525c46919cd11bd59facb1e52cd2c23aff390fc8d6bc89ef7499cfd939f9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 04:45:16 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/812a525c46919cd11bd59facb1e52cd2c23aff390fc8d6bc89ef7499cfd939f9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 04:45:16 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.
Dec 11 04:45:16 np0005555140 podman[170082]: 2025-12-11 09:45:16.418579259 +0000 UTC m=+0.645620656 container init 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec 11 04:45:16 np0005555140 multipathd[170099]: + sudo -E kolla_set_configs
Dec 11 04:45:16 np0005555140 podman[170082]: 2025-12-11 09:45:16.461017173 +0000 UTC m=+0.688058530 container start 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 04:45:16 np0005555140 multipathd[170099]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:45:16 np0005555140 multipathd[170099]: INFO:__main__:Validating config file
Dec 11 04:45:16 np0005555140 multipathd[170099]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:45:16 np0005555140 multipathd[170099]: INFO:__main__:Writing out command to execute
Dec 11 04:45:16 np0005555140 multipathd[170099]: ++ cat /run_command
Dec 11 04:45:16 np0005555140 multipathd[170099]: + CMD='/usr/sbin/multipathd -d'
Dec 11 04:45:16 np0005555140 multipathd[170099]: + ARGS=
Dec 11 04:45:16 np0005555140 multipathd[170099]: + sudo kolla_copy_cacerts
Dec 11 04:45:16 np0005555140 multipathd[170099]: + [[ ! -n '' ]]
Dec 11 04:45:16 np0005555140 multipathd[170099]: + . kolla_extend_start
Dec 11 04:45:16 np0005555140 multipathd[170099]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 11 04:45:16 np0005555140 multipathd[170099]: Running command: '/usr/sbin/multipathd -d'
Dec 11 04:45:16 np0005555140 multipathd[170099]: + umask 0022
Dec 11 04:45:16 np0005555140 multipathd[170099]: + exec /usr/sbin/multipathd -d
Dec 11 04:45:16 np0005555140 multipathd[170099]: 2675.199995 | --------start up--------
Dec 11 04:45:16 np0005555140 multipathd[170099]: 2675.200071 | read /etc/multipath.conf
Dec 11 04:45:16 np0005555140 multipathd[170099]: 2675.209833 | path checkers start up
Dec 11 04:45:16 np0005555140 podman[170082]: multipathd
Dec 11 04:45:16 np0005555140 systemd[1]: Started multipathd container.
Dec 11 04:45:16 np0005555140 podman[170106]: 2025-12-11 09:45:16.741244671 +0000 UTC m=+0.262088419 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 04:45:17 np0005555140 python3.9[170290]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:18 np0005555140 python3.9[170442]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 11 04:45:18 np0005555140 python3.9[170594]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 11 04:45:18 np0005555140 kernel: Key type psk registered
Dec 11 04:45:19 np0005555140 python3.9[170759]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:45:20 np0005555140 python3.9[170882]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446319.1057947-630-266901969965726/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:20 np0005555140 python3.9[171034]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:21 np0005555140 python3.9[171186]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:45:21 np0005555140 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 11 04:45:21 np0005555140 systemd[1]: Stopped Load Kernel Modules.
Dec 11 04:45:21 np0005555140 systemd[1]: Stopping Load Kernel Modules...
Dec 11 04:45:21 np0005555140 systemd[1]: Starting Load Kernel Modules...
Dec 11 04:45:21 np0005555140 systemd[1]: Finished Load Kernel Modules.
Dec 11 04:45:22 np0005555140 python3.9[171342]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 04:45:23 np0005555140 podman[171345]: 2025-12-11 09:45:23.67972958 +0000 UTC m=+0.048352660 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Dec 11 04:45:25 np0005555140 systemd[1]: Reloading.
Dec 11 04:45:25 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:45:25 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:45:25 np0005555140 systemd[1]: Reloading.
Dec 11 04:45:25 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:45:25 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:45:25 np0005555140 systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 11 04:45:25 np0005555140 systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 11 04:45:25 np0005555140 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 04:45:25 np0005555140 systemd[1]: Starting man-db-cache-update.service...
Dec 11 04:45:25 np0005555140 systemd[1]: Reloading.
Dec 11 04:45:26 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:45:26 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:45:26 np0005555140 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 04:45:27 np0005555140 python3.9[172794]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:45:27 np0005555140 systemd[1]: Stopping Open-iSCSI...
Dec 11 04:45:27 np0005555140 iscsid[161179]: iscsid shutting down.
Dec 11 04:45:27 np0005555140 systemd[1]: iscsid.service: Deactivated successfully.
Dec 11 04:45:27 np0005555140 systemd[1]: Stopped Open-iSCSI.
Dec 11 04:45:27 np0005555140 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 11 04:45:27 np0005555140 systemd[1]: Starting Open-iSCSI...
Dec 11 04:45:27 np0005555140 systemd[1]: Started Open-iSCSI.
Dec 11 04:45:28 np0005555140 python3.9[172980]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:45:29 np0005555140 python3.9[173136]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:30 np0005555140 python3.9[173288]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:45:30 np0005555140 systemd[1]: Reloading.
Dec 11 04:45:30 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:45:30 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:45:30 np0005555140 podman[173290]: 2025-12-11 09:45:30.477744455 +0000 UTC m=+0.122370026 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 04:45:30 np0005555140 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 04:45:30 np0005555140 systemd[1]: Finished man-db-cache-update.service.
Dec 11 04:45:30 np0005555140 systemd[1]: man-db-cache-update.service: Consumed 1.592s CPU time.
Dec 11 04:45:30 np0005555140 systemd[1]: run-r810b36c79b0b41fcb7aee537440494c6.service: Deactivated successfully.
Dec 11 04:45:30 np0005555140 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 11 04:45:31 np0005555140 python3.9[173499]: ansible-ansible.builtin.service_facts Invoked
Dec 11 04:45:31 np0005555140 network[173516]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 04:45:31 np0005555140 network[173517]: 'network-scripts' will be removed from distribution in near future.
Dec 11 04:45:31 np0005555140 network[173518]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 04:45:32 np0005555140 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 11 04:45:33 np0005555140 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 11 04:45:34 np0005555140 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 11 04:45:35 np0005555140 python3.9[173795]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:35 np0005555140 python3.9[173948]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:36 np0005555140 python3.9[174101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:37 np0005555140 python3.9[174254]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:38 np0005555140 python3.9[174407]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:38 np0005555140 python3.9[174560]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:39 np0005555140 python3.9[174713]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:40 np0005555140 python3.9[174866]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:45:41 np0005555140 python3.9[175019]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:41 np0005555140 python3.9[175171]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:42 np0005555140 python3.9[175323]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:43 np0005555140 python3.9[175475]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:43 np0005555140 python3.9[175627]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:44 np0005555140 python3.9[175779]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:44 np0005555140 python3.9[175931]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:45 np0005555140 python3.9[176083]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:46 np0005555140 python3.9[176235]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:46 np0005555140 python3.9[176387]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:47 np0005555140 podman[176511]: 2025-12-11 09:45:47.322927369 +0000 UTC m=+0.098888961 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 04:45:47 np0005555140 python3.9[176559]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:48 np0005555140 python3.9[176712]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:45:48.609 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:45:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:45:48.610 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:45:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:45:48.610 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:45:48 np0005555140 python3.9[176864]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:49 np0005555140 python3.9[177016]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:50 np0005555140 python3.9[177168]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:50 np0005555140 python3.9[177320]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:45:51 np0005555140 python3.9[177472]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:52 np0005555140 python3.9[177625]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 04:45:53 np0005555140 python3.9[177777]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:45:53 np0005555140 systemd[1]: Reloading.
Dec 11 04:45:53 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:45:53 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:45:53 np0005555140 podman[177779]: 2025-12-11 09:45:53.825693673 +0000 UTC m=+0.094498476 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:45:54 np0005555140 python3.9[177983]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:55 np0005555140 python3.9[178136]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:55 np0005555140 python3.9[178289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:56 np0005555140 python3.9[178442]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:57 np0005555140 python3.9[178595]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:57 np0005555140 python3.9[178748]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:58 np0005555140 python3.9[178901]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:45:59 np0005555140 python3.9[179054]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:46:01 np0005555140 podman[179179]: 2025-12-11 09:46:01.610872508 +0000 UTC m=+0.093843857 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:46:01 np0005555140 python3.9[179225]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:02 np0005555140 python3.9[179384]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:03 np0005555140 python3.9[179536]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:03 np0005555140 python3.9[179688]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:04 np0005555140 python3.9[179840]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:05 np0005555140 python3.9[179992]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:05 np0005555140 python3.9[180144]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:06 np0005555140 python3.9[180296]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:06 np0005555140 python3.9[180448]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:07 np0005555140 python3.9[180600]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:12 np0005555140 python3.9[180752]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 11 04:46:12 np0005555140 python3.9[180905]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 04:46:13 np0005555140 python3.9[181063]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 04:46:15 np0005555140 systemd-logind[787]: New session 24 of user zuul.
Dec 11 04:46:15 np0005555140 systemd[1]: Started Session 24 of User zuul.
Dec 11 04:46:15 np0005555140 systemd[1]: session-24.scope: Deactivated successfully.
Dec 11 04:46:15 np0005555140 systemd-logind[787]: Session 24 logged out. Waiting for processes to exit.
Dec 11 04:46:15 np0005555140 systemd-logind[787]: Removed session 24.
Dec 11 04:46:15 np0005555140 python3.9[181249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:16 np0005555140 python3.9[181370]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446375.4743447-1229-187082385231689/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:16 np0005555140 python3.9[181520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:17 np0005555140 python3.9[181596]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:17 np0005555140 podman[181597]: 2025-12-11 09:46:17.559824773 +0000 UTC m=+0.067503610 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 04:46:18 np0005555140 python3.9[181767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:18 np0005555140 python3.9[181888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446377.6455932-1229-186508878988478/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:19 np0005555140 python3.9[182038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:19 np0005555140 python3.9[182159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446378.7402577-1229-170301815621133/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:20 np0005555140 python3.9[182309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:20 np0005555140 python3.9[182430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446379.8098044-1229-270648023269159/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:21 np0005555140 python3.9[182580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:21 np0005555140 python3.9[182701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446380.9986432-1229-75031286976505/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:22 np0005555140 python3.9[182853]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:46:23 np0005555140 python3.9[183005]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:46:23 np0005555140 python3.9[183157]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:46:24 np0005555140 podman[183281]: 2025-12-11 09:46:24.389913538 +0000 UTC m=+0.055108565 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:46:24 np0005555140 python3.9[183327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:25 np0005555140 python3.9[183450]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1765446384.0886896-1336-176027279582381/.source _original_basename=.vacswfh4 follow=False checksum=d6187947ae90fecc7908a2e0700bc0ef57b8ea38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 11 04:46:25 np0005555140 python3.9[183602]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:46:26 np0005555140 python3.9[183754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:27 np0005555140 python3.9[183875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446386.1075022-1362-76083681076922/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=209f20105d13c02e6cb251483bae1beb11a1258f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:27 np0005555140 python3.9[184025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:46:28 np0005555140 python3.9[184146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446387.2436323-1377-113680576035927/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:46:29 np0005555140 python3.9[184298]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 11 04:46:29 np0005555140 python3.9[184450]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:46:30 np0005555140 python3[184602]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:46:30 np0005555140 podman[184638]: 2025-12-11 09:46:30.963055248 +0000 UTC m=+0.058520983 container create 98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, container_name=nova_compute_init, config_id=edpm)
Dec 11 04:46:30 np0005555140 podman[184638]: 2025-12-11 09:46:30.927713232 +0000 UTC m=+0.023178987 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Dec 11 04:46:30 np0005555140 python3[184602]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 11 04:46:31 np0005555140 python3.9[184828]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:46:32 np0005555140 podman[184954]: 2025-12-11 09:46:32.45946585 +0000 UTC m=+0.070359704 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 04:46:32 np0005555140 python3.9[185001]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 11 04:46:33 np0005555140 python3.9[185159]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:46:34 np0005555140 python3[185311]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:46:34 np0005555140 podman[185349]: 2025-12-11 09:46:34.365628123 +0000 UTC m=+0.045025805 container create b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 11 04:46:34 np0005555140 podman[185349]: 2025-12-11 09:46:34.34256214 +0000 UTC m=+0.021959852 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Dec 11 04:46:34 np0005555140 python3[185311]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Dec 11 04:46:35 np0005555140 python3.9[185540]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:46:35 np0005555140 python3.9[185694]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:46:36 np0005555140 python3.9[185845]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765446395.9419305-1469-6133400048356/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:46:37 np0005555140 python3.9[185921]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:46:37 np0005555140 systemd[1]: Reloading.
Dec 11 04:46:37 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:46:37 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:46:37 np0005555140 python3.9[186033]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:46:38 np0005555140 systemd[1]: Reloading.
Dec 11 04:46:38 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:46:38 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:46:38 np0005555140 systemd[1]: Starting nova_compute container...
Dec 11 04:46:38 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:46:38 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:38 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:38 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:38 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:38 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:38 np0005555140 podman[186073]: 2025-12-11 09:46:38.445472609 +0000 UTC m=+0.135561588 container init b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, container_name=nova_compute)
Dec 11 04:46:38 np0005555140 podman[186073]: 2025-12-11 09:46:38.452653815 +0000 UTC m=+0.142742784 container start b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 04:46:38 np0005555140 podman[186073]: nova_compute
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + sudo -E kolla_set_configs
Dec 11 04:46:38 np0005555140 systemd[1]: Started nova_compute container.
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Validating config file
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying service configuration files
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Deleting /etc/ceph
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Creating directory /etc/ceph
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /etc/ceph
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Writing out command to execute
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:38 np0005555140 nova_compute[186088]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 04:46:38 np0005555140 nova_compute[186088]: ++ cat /run_command
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + CMD=nova-compute
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + ARGS=
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + sudo kolla_copy_cacerts
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + [[ ! -n '' ]]
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + . kolla_extend_start
Dec 11 04:46:38 np0005555140 nova_compute[186088]: Running command: 'nova-compute'
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + echo 'Running command: '\''nova-compute'\'''
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + umask 0022
Dec 11 04:46:38 np0005555140 nova_compute[186088]: + exec nova-compute
Dec 11 04:46:39 np0005555140 python3.9[186249]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:46:40 np0005555140 python3.9[186400]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:46:40 np0005555140 nova_compute[186088]: 2025-12-11 09:46:40.573 186092 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 04:46:40 np0005555140 nova_compute[186088]: 2025-12-11 09:46:40.573 186092 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 04:46:40 np0005555140 nova_compute[186088]: 2025-12-11 09:46:40.573 186092 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 04:46:40 np0005555140 nova_compute[186088]: 2025-12-11 09:46:40.574 186092 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec 11 04:46:40 np0005555140 python3.9[186552]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:46:40 np0005555140 nova_compute[186088]: 2025-12-11 09:46:40.721 186092 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:46:40 np0005555140 nova_compute[186088]: 2025-12-11 09:46:40.744 186092 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:46:40 np0005555140 nova_compute[186088]: 2025-12-11 09:46:40.745 186092 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.299 186092 INFO nova.virt.driver [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.405 186092 INFO nova.compute.provider_config [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.419 186092 DEBUG oslo_concurrency.lockutils [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.419 186092 DEBUG oslo_concurrency.lockutils [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.420 186092 DEBUG oslo_concurrency.lockutils [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.420 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.420 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.420 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.420 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.420 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.421 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.421 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.421 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.421 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.421 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.421 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.421 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.422 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.422 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.422 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.422 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.422 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.422 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.423 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.423 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.423 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.423 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.423 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.423 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.423 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.424 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.424 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.424 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.424 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.424 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.424 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.425 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.425 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.425 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.425 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.425 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.425 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.425 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.426 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.426 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.426 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.426 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.426 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.426 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.427 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.427 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.427 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.427 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.427 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.428 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.428 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.428 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.428 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.428 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.428 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.429 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.429 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.429 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.429 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.429 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.429 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.429 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.430 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.430 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.430 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.430 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.430 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.430 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.430 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.431 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.431 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.431 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.431 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.432 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.432 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.432 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.432 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.432 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.432 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.433 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.433 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.433 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.433 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.433 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.433 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.433 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.434 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.434 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.434 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.434 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.434 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.434 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.435 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.435 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.435 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.435 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.435 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.435 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.435 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.436 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.436 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.436 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.436 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.436 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.436 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.436 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.437 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.437 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.437 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.437 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.437 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.437 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.437 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.438 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.438 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.438 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.438 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.438 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.438 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.438 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.439 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.439 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.439 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.439 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.439 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.439 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.440 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.440 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.440 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.440 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.440 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.440 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.441 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.441 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.441 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.441 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.441 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.441 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.442 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.442 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.442 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.442 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.442 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.442 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.443 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.443 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.443 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.443 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.443 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.443 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.444 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.444 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.444 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.444 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.444 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.444 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.444 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.445 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.445 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.445 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.445 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.445 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.445 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.445 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.446 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.446 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.446 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.446 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.446 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.446 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.446 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.447 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.447 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.447 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.447 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.447 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.447 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.448 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.448 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.448 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.448 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.448 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.448 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.448 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.449 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.449 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.449 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.449 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.449 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.449 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.450 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.450 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.450 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.450 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.450 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.451 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.451 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.451 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.451 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.451 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.451 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.451 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.452 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.452 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.452 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.452 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.452 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.453 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.453 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.453 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.453 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.453 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.453 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.454 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.454 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.454 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.454 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.454 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.455 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.455 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.455 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.455 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.455 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.455 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.456 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.456 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.456 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.456 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.456 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.456 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.456 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.457 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.457 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.457 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.457 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.457 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.457 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.458 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.458 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.458 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.458 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.458 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.458 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.459 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.459 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.459 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.459 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.459 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.459 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.460 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.460 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.460 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.460 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.460 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.460 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.460 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.461 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.461 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.461 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.461 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.461 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.461 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.461 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.462 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.462 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.462 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.462 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.462 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.462 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.462 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.463 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.463 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.463 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.463 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.463 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.463 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.463 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.464 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.464 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.464 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.464 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.464 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.464 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.465 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.465 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.465 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.465 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.465 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.465 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.466 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.466 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.466 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.466 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.466 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.466 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.467 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.467 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.467 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.467 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.467 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.467 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.467 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.468 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.468 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.468 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.468 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.468 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.468 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.469 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.469 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.469 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.469 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.469 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.469 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.469 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.470 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.470 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.470 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.470 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.470 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.470 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.471 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.471 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.471 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.471 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.471 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.471 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.472 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.472 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.472 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.472 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.472 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.472 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.473 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.473 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.473 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.473 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.473 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.473 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.474 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.474 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.474 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.474 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.474 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.474 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.475 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.475 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.475 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.475 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.475 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.476 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.476 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.476 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.476 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.476 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.476 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.477 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.477 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.477 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.477 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.477 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.477 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.477 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.478 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.478 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.478 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.478 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.478 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.479 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.479 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.479 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.479 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.479 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.479 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.479 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.480 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.480 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.480 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.480 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.481 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.481 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.481 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.481 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.481 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.482 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.482 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.482 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.482 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.482 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.483 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.483 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.483 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.483 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.483 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.484 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.484 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.484 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.484 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.484 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.484 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.485 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.485 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.485 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.485 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.485 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.485 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.485 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.486 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.486 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.486 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.486 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.486 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.486 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.487 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.487 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.487 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.487 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.487 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.487 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.488 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.488 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.488 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.488 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.488 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.488 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.489 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.489 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.489 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.489 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.489 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.489 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.489 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.490 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.490 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.490 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.490 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.490 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.490 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.490 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.491 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.492 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.492 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.492 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.492 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.492 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.493 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.493 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.493 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.493 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.493 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.493 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.494 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.494 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.494 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.494 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.494 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.494 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.495 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.495 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.495 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.495 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.495 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.495 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.495 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.496 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.496 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.496 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.496 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.496 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.496 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.497 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.498 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.498 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.498 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.498 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.498 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.498 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.499 186092 WARNING oslo_config.cfg [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 11 04:46:41 np0005555140 nova_compute[186088]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 11 04:46:41 np0005555140 nova_compute[186088]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 11 04:46:41 np0005555140 nova_compute[186088]: and ``live_migration_inbound_addr`` respectively.
Dec 11 04:46:41 np0005555140 nova_compute[186088]: ).  Its value may be silently ignored in the future.
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.499 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.499 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.499 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.499 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.500 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.500 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.500 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.500 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.500 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.500 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.501 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.501 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.501 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.501 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.501 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.501 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.501 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.502 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.503 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.503 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.503 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.503 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.503 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.503 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.504 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.504 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.504 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.504 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.504 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.505 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.505 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.505 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.505 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.505 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.505 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.506 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.506 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.506 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.506 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.506 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.506 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.507 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.507 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.507 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.507 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.507 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.507 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.507 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.508 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.508 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.508 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.508 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.508 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.508 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.509 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.509 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.509 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.509 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.509 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.509 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.510 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.510 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.510 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.510 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.510 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.510 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.510 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.511 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.511 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.511 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.511 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.511 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.511 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.511 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.512 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.512 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.512 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.512 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.512 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.512 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.513 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.513 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.513 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.513 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.513 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.513 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.513 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.514 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.514 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.514 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.514 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.514 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.514 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.514 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.515 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.515 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.515 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.515 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.515 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.515 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.516 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.516 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.516 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.516 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.516 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.516 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.516 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.517 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.517 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.517 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.517 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.517 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.518 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.518 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.518 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.518 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.518 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.518 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.518 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.519 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.519 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.519 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.519 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.519 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.519 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.520 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.520 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.520 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.520 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.520 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.520 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.521 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.521 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.521 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.521 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.521 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.522 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.522 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.522 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.522 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.522 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.523 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.523 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.523 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.523 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.523 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.524 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.524 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.524 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.524 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.524 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.525 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.525 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.525 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.525 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.525 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.526 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.526 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.526 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.526 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.526 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.526 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.527 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.527 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.527 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.527 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.527 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.527 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.527 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.528 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.528 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.528 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.528 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.528 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.528 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.529 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.529 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.529 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.529 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.529 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.529 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.530 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.530 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.530 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.530 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.530 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.531 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.531 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.531 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.531 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.531 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.532 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.532 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.532 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.532 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.532 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.532 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.532 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.533 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.533 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.533 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.533 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.533 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.533 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.533 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.534 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.534 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.534 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.534 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.534 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.534 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.535 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.535 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.535 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.535 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.535 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.535 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.535 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.536 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.536 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.536 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.536 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.536 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.537 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.537 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.537 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.537 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.537 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.538 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.538 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.538 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.538 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.538 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.539 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.539 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.539 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.539 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.539 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.540 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.540 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.540 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.540 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.541 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.541 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.541 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.541 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.542 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.542 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.542 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.542 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.542 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.543 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.543 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.543 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.543 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.543 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.543 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.544 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.544 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.544 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.544 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.545 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.545 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.545 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.545 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.545 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.546 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.546 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.546 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.546 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.546 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.547 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.547 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.547 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.547 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.547 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.548 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.548 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.548 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.548 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.548 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.549 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.549 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.549 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.549 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.550 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.550 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.550 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.550 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.550 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.551 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.551 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.551 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.551 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.551 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.552 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.552 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.552 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.552 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.552 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.553 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.553 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.553 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.553 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.553 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.554 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.554 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.554 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.554 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.554 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.555 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.555 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.555 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.555 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.555 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.555 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.556 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.556 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.556 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.556 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.556 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.556 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.557 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.557 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.557 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.557 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.557 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.557 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.557 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.558 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.558 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.558 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.558 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.558 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.559 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.560 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.560 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.560 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.560 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.560 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.560 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.560 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.561 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.561 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.561 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.561 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.561 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.561 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.561 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.562 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.562 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.562 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.562 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.562 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.562 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.563 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.563 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.563 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.563 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.563 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.563 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.564 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.564 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.564 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.564 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.564 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.564 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.564 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.565 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.565 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.565 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.565 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.565 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.565 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.566 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.566 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.566 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.566 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.566 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.566 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.567 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.567 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.567 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.567 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.567 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.567 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.568 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.568 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.568 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.568 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.568 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.569 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.569 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.569 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.569 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.569 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.569 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.570 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.570 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.570 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.570 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.570 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.570 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.571 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.571 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.571 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.571 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.571 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.571 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.572 186092 DEBUG oslo_service.service [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
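The block of DEBUG lines above is oslo.config's standard startup dump: at service start, every registered option is logged as one "group.option = value" line between rows of asterisks. A minimal stdlib-only sketch of that behavior (the group/option names below are illustrative, taken from the log; this is not the real oslo_config implementation):

```python
# Sketch of oslo.config's log_opt_values(): dump every registered option as
# one DEBUG line, "group.option" left-padded to a fixed width, framed by
# rows of asterisks. Stdlib only; option names are examples from the log.
import io
import logging

def log_opt_values(options, logger, level=logging.DEBUG):
    """Dump all config options, roughly the way oslo_config.cfg does."""
    logger.log(level, "*" * 80)
    for (group, name), value in sorted(options.items()):
        # oslo.config pads the "group.option" key so values line up
        logger.log(level, "%-30s = %s", f"{group}.{name}", value)
    logger.log(level, "*" * 80)

stream = io.StringIO()
logger = logging.getLogger("oslo_service.service.sketch")
logger.setLevel(logging.DEBUG)
logger.addHandler(logging.StreamHandler(stream))

opts = {
    ("os_vif_ovs", "ovsdb_connection"): "tcp:127.0.0.1:6640",
    ("privsep_osbrick", "thread_pool_size"): 8,
}
log_opt_values(opts, logger)
print(stream.getvalue())
```

Because the dump runs at DEBUG level, it only appears when `debug = True` is set in the service's configuration, which is why it dominates this section of the journal.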
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.573 186092 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.588 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.588 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.589 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.589 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec 11 04:46:41 np0005555140 systemd[1]: Starting libvirt QEMU daemon...
Dec 11 04:46:41 np0005555140 systemd[1]: Started libvirt QEMU daemon.
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.660 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9292d20e20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.663 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9292d20e20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.663 186092 INFO nova.virt.libvirt.driver [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.680 186092 WARNING nova.virt.libvirt.driver [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Dec 11 04:46:41 np0005555140 nova_compute[186088]: 2025-12-11 09:46:41.680 186092 DEBUG nova.virt.libvirt.volume.mount [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec 11 04:46:41 np0005555140 python3.9[186706]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
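The flattened module invocation above is Ansible logging every parameter of a `containers.podman.podman_container` call; all but a handful fell back to module defaults (`None`). The task behind it looks roughly like the following sketch (task name and layout are assumptions; only the non-default parameters from the log are shown):

```yaml
# Sketch of the Ansible task behind the logged invocation: remove the
# nova_nvme_cleaner container if it exists. Parameters mirror the
# non-default values visible in the journal entry.
- name: Ensure nova_nvme_cleaner container is absent
  containers.podman.podman_container:
    name: nova_nvme_cleaner
    state: absent
    executable: podman
    force_delete: true
```

Note `secrets=NOT_LOGGING_PARAMETER` in the log: Ansible masks `no_log` parameters in this dump rather than printing their values.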
Dec 11 04:46:41 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.476 186092 INFO nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Libvirt host capabilities <capabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <host>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <uuid>8f17f30a-28b8-44d4-928b-b22923e377d2</uuid>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <arch>x86_64</arch>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model>EPYC-Rome-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <vendor>AMD</vendor>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <microcode version='16777317'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <signature family='23' model='49' stepping='0'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <maxphysaddr mode='emulate' bits='40'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='x2apic'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='tsc-deadline'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='osxsave'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='hypervisor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='tsc_adjust'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='spec-ctrl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='stibp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='arch-capabilities'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='cmp_legacy'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='topoext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='virt-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='lbrv'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='tsc-scale'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='vmcb-clean'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='pause-filter'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='pfthreshold'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='svme-addr-chk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='rdctl-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='skip-l1dfl-vmentry'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='mds-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature name='pschange-mc-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <pages unit='KiB' size='4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <pages unit='KiB' size='2048'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <pages unit='KiB' size='1048576'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <power_management>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <suspend_mem/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <suspend_disk/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <suspend_hybrid/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </power_management>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <iommu support='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <migration_features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <live/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <uri_transports>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <uri_transport>tcp</uri_transport>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <uri_transport>rdma</uri_transport>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </uri_transports>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </migration_features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <topology>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <cells num='1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <cell id='0'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          <memory unit='KiB'>7864308</memory>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          <pages unit='KiB' size='4'>1966077</pages>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          <pages unit='KiB' size='2048'>0</pages>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          <pages unit='KiB' size='1048576'>0</pages>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          <distances>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <sibling id='0' value='10'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          </distances>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          <cpus num='8'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:          </cpus>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        </cell>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </cells>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </topology>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <cache>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </cache>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <secmodel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model>selinux</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <doi>0</doi>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </secmodel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <secmodel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model>dac</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <doi>0</doi>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </secmodel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </host>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <guest>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <os_type>hvm</os_type>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <arch name='i686'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <wordsize>32</wordsize>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <domain type='qemu'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <domain type='kvm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </arch>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <pae/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <nonpae/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <acpi default='on' toggle='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <apic default='on' toggle='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <cpuselection/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <deviceboot/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <disksnapshot default='on' toggle='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <externalSnapshot/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </guest>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <guest>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <os_type>hvm</os_type>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <arch name='x86_64'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <wordsize>64</wordsize>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <domain type='qemu'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <domain type='kvm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </arch>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <acpi default='on' toggle='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <apic default='on' toggle='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <cpuselection/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <deviceboot/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <disksnapshot default='on' toggle='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <externalSnapshot/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </guest>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 
Dec 11 04:46:42 np0005555140 nova_compute[186088]: </capabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.487 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.506 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 11 04:46:42 np0005555140 nova_compute[186088]: <domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <domain>kvm</domain>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <arch>i686</arch>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <vcpu max='240'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <iothreads supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <os supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='firmware'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <loader supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>rom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pflash</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='readonly'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>yes</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='secure'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </loader>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </os>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='maximumMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <vendor>AMD</vendor>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='succor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='custom' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-128'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-256'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-512'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <memoryBacking supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='sourceType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>anonymous</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>memfd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </memoryBacking>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <disk supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='diskDevice'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>disk</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cdrom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>floppy</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>lun</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ide</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>fdc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>sata</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </disk>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <graphics supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vnc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egl-headless</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </graphics>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <video supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='modelType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vga</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cirrus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>none</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>bochs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ramfb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </video>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hostdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='mode'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>subsystem</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='startupPolicy'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>mandatory</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>requisite</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>optional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='subsysType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pci</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='capsType'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='pciBackend'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hostdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <rng supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>random</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </rng>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <filesystem supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='driverType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>path</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>handle</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtiofs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </filesystem>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <tpm supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-tis</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-crb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emulator</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>external</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendVersion'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>2.0</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </tpm>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <redirdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </redirdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <channel supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </channel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <crypto supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </crypto>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <interface supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>passt</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </interface>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <panic supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>isa</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>hyperv</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </panic>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <console supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>null</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dev</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pipe</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stdio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>udp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tcp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu-vdagent</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </console>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <gic supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <genid supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backup supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <async-teardown supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <ps2 supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sev supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sgx supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hyperv supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='features'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>relaxed</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vapic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>spinlocks</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vpindex</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>runtime</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>synic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stimer</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reset</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vendor_id</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>frequencies</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reenlightenment</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tlbflush</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ipi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>avic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emsr_bitmap</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>xmm_input</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hyperv>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <launchSecurity supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='sectype'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tdx</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </launchSecurity>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: </domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.511 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 11 04:46:42 np0005555140 nova_compute[186088]: <domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <domain>kvm</domain>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <arch>i686</arch>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <vcpu max='4096'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <iothreads supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <os supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='firmware'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <loader supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>rom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pflash</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='readonly'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>yes</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='secure'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </loader>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </os>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='maximumMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <vendor>AMD</vendor>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='succor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='custom' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-128'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-256'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-512'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <memoryBacking supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='sourceType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>anonymous</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>memfd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </memoryBacking>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <disk supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='diskDevice'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>disk</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cdrom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>floppy</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>lun</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>fdc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>sata</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </disk>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <graphics supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vnc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egl-headless</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </graphics>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <video supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='modelType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vga</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cirrus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>none</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>bochs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ramfb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </video>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hostdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='mode'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>subsystem</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='startupPolicy'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>mandatory</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>requisite</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>optional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='subsysType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pci</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='capsType'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='pciBackend'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hostdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <rng supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>random</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </rng>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <filesystem supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='driverType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>path</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>handle</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtiofs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </filesystem>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <tpm supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-tis</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-crb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emulator</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>external</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendVersion'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>2.0</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </tpm>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <redirdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </redirdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <channel supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </channel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <crypto supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </crypto>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <interface supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>passt</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </interface>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <panic supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>isa</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>hyperv</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </panic>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <console supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>null</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dev</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pipe</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stdio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>udp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tcp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu-vdagent</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </console>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <gic supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <genid supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backup supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <async-teardown supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <ps2 supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sev supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sgx supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hyperv supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='features'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>relaxed</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vapic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>spinlocks</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vpindex</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>runtime</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>synic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stimer</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reset</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vendor_id</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>frequencies</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reenlightenment</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tlbflush</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ipi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>avic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emsr_bitmap</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>xmm_input</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hyperv>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <launchSecurity supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='sectype'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tdx</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </launchSecurity>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: </domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.570 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.574 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 11 04:46:42 np0005555140 nova_compute[186088]: <domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <domain>kvm</domain>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <arch>x86_64</arch>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <vcpu max='240'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <iothreads supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <os supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='firmware'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <loader supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>rom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pflash</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='readonly'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>yes</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='secure'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </loader>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </os>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='maximumMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <vendor>AMD</vendor>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='succor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='custom' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 python3.9[186939]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-128'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-256'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-512'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 systemd[1]: Stopping nova_compute container...
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <memoryBacking supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='sourceType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>anonymous</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>memfd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </memoryBacking>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <disk supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='diskDevice'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>disk</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cdrom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>floppy</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>lun</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ide</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>fdc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>sata</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </disk>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <graphics supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vnc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egl-headless</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </graphics>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <video supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='modelType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vga</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cirrus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>none</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>bochs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ramfb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </video>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hostdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='mode'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>subsystem</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='startupPolicy'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>mandatory</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>requisite</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>optional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='subsysType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pci</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='capsType'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='pciBackend'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hostdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <rng supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>random</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </rng>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <filesystem supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='driverType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>path</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>handle</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtiofs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </filesystem>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <tpm supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-tis</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-crb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emulator</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>external</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendVersion'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>2.0</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </tpm>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <redirdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </redirdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <channel supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </channel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <crypto supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </crypto>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <interface supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>passt</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </interface>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <panic supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>isa</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>hyperv</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </panic>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <console supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>null</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dev</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pipe</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stdio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>udp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tcp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu-vdagent</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </console>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <gic supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <genid supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backup supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <async-teardown supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <ps2 supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sev supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sgx supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hyperv supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='features'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>relaxed</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vapic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>spinlocks</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vpindex</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>runtime</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>synic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stimer</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reset</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vendor_id</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>frequencies</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reenlightenment</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tlbflush</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ipi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>avic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emsr_bitmap</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>xmm_input</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hyperv>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <launchSecurity supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='sectype'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tdx</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </launchSecurity>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: </domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.662 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 11 04:46:42 np0005555140 nova_compute[186088]: <domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <domain>kvm</domain>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <arch>x86_64</arch>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <vcpu max='4096'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <iothreads supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <os supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='firmware'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>efi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <loader supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>rom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pflash</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='readonly'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>yes</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='secure'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>yes</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>no</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </loader>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </os>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='maximumMigratable'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>on</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>off</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <vendor>AMD</vendor>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='succor'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <mode name='custom' supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Denverton-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='auto-ibrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amd-psfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='stibp-always-on'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='EPYC-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-128'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-256'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx10-512'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='prefetchiti'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Haswell-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512er'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512pf'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fma4'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tbm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xop'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='amx-tile'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-bf16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-fp16'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bitalg'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrc'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fzrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='la57'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='taa-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xfd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ifma'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cmpccxadd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fbsdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='fsrs'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ibrs-all'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mcdt-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pbrsb-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='psdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='serialize'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vaes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='hle'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='rtm'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512bw'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512cd'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512dq'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512f'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='avx512vl'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='invpcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pcid'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='pku'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='mpx'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='core-capability'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='split-lock-detect'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='cldemote'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='erms'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='gfni'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdir64b'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='movdiri'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='xsaves'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='athlon-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='core2duo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='coreduo-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='n270-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='ss'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <blockers model='phenom-v1'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnow'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <feature name='3dnowext'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </blockers>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </mode>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </cpu>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <memoryBacking supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <enum name='sourceType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>anonymous</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <value>memfd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </memoryBacking>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <disk supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='diskDevice'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>disk</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cdrom</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>floppy</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>lun</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>fdc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>sata</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </disk>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <graphics supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vnc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egl-headless</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </graphics>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <video supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='modelType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vga</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>cirrus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>none</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>bochs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ramfb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </video>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hostdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='mode'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>subsystem</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='startupPolicy'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>mandatory</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>requisite</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>optional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='subsysType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pci</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>scsi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='capsType'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='pciBackend'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hostdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <rng supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtio-non-transitional</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>random</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>egd</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </rng>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <filesystem supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='driverType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>path</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>handle</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>virtiofs</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </filesystem>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <tpm supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-tis</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tpm-crb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emulator</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>external</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendVersion'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>2.0</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </tpm>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <redirdev supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='bus'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>usb</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </redirdev>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <channel supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </channel>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <crypto supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendModel'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>builtin</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </crypto>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <interface supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='backendType'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>default</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>passt</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </interface>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <panic supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='model'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>isa</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>hyperv</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </panic>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <console supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='type'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>null</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vc</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pty</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dev</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>file</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>pipe</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stdio</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>udp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tcp</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>unix</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>qemu-vdagent</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>dbus</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </console>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </devices>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  <features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <gic supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <genid supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <backup supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <async-teardown supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <ps2 supported='yes'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sev supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <sgx supported='no'/>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <hyperv supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='features'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>relaxed</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vapic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>spinlocks</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vpindex</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>runtime</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>synic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>stimer</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reset</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>vendor_id</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>frequencies</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>reenlightenment</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tlbflush</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>ipi</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>avic</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>emsr_bitmap</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>xmm_input</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </defaults>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </hyperv>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    <launchSecurity supported='yes'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      <enum name='sectype'>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:        <value>tdx</value>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:      </enum>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:    </launchSecurity>
Dec 11 04:46:42 np0005555140 nova_compute[186088]:  </features>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: </domainCapabilities>
Dec 11 04:46:42 np0005555140 nova_compute[186088]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.749 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.749 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.749 186092 DEBUG nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.750 186092 INFO nova.virt.libvirt.host [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Secure Boot support detected#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.752 186092 INFO nova.virt.libvirt.driver [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.752 186092 INFO nova.virt.libvirt.driver [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.760 186092 DEBUG nova.virt.libvirt.driver [None req-9f870157-3cc0-463e-b32e-8a9089cc05e7 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.811 186092 DEBUG oslo_concurrency.lockutils [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.811 186092 DEBUG oslo_concurrency.lockutils [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:46:42 np0005555140 nova_compute[186088]: 2025-12-11 09:46:42.812 186092 DEBUG oslo_concurrency.lockutils [None req-22e6cd08-8f57-48e9-b8ca-e22129df3e18 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:46:43 np0005555140 virtqemud[186728]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 11 04:46:43 np0005555140 virtqemud[186728]: hostname: compute-0
Dec 11 04:46:43 np0005555140 virtqemud[186728]: End of file while reading data: Input/output error
Dec 11 04:46:43 np0005555140 systemd[1]: libpod-b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507.scope: Deactivated successfully.
Dec 11 04:46:43 np0005555140 systemd[1]: libpod-b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507.scope: Consumed 3.093s CPU time.
Dec 11 04:46:43 np0005555140 podman[186947]: 2025-12-11 09:46:43.233063093 +0000 UTC m=+0.482711949 container died b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 04:46:43 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507-userdata-shm.mount: Deactivated successfully.
Dec 11 04:46:43 np0005555140 systemd[1]: var-lib-containers-storage-overlay-c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120-merged.mount: Deactivated successfully.
Dec 11 04:46:43 np0005555140 podman[186947]: 2025-12-11 09:46:43.294435087 +0000 UTC m=+0.544083943 container cleanup b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 04:46:43 np0005555140 podman[186947]: nova_compute
Dec 11 04:46:43 np0005555140 podman[186977]: nova_compute
Dec 11 04:46:43 np0005555140 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 11 04:46:43 np0005555140 systemd[1]: Stopped nova_compute container.
Dec 11 04:46:43 np0005555140 systemd[1]: Starting nova_compute container...
Dec 11 04:46:43 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:46:43 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:43 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:43 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:43 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:43 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8b699264d246ca9d36aca58b644d8eb9f663e8a37103807107462faa234f120/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:43 np0005555140 podman[186990]: 2025-12-11 09:46:43.461435959 +0000 UTC m=+0.081323180 container init b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 04:46:43 np0005555140 podman[186990]: 2025-12-11 09:46:43.475990837 +0000 UTC m=+0.095878028 container start b377c2ff9fa747583cb3416129a4ac1eb39c269a89a6b8796be2ee373396d507 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.schema-version=1.0)
Dec 11 04:46:43 np0005555140 podman[186990]: nova_compute
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + sudo -E kolla_set_configs
Dec 11 04:46:43 np0005555140 systemd[1]: Started nova_compute container.
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Validating config file
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying service configuration files
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /etc/ceph
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Creating directory /etc/ceph
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /etc/ceph
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Writing out command to execute
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:43 np0005555140 nova_compute[187006]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 04:46:43 np0005555140 nova_compute[187006]: ++ cat /run_command
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + CMD=nova-compute
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + ARGS=
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + sudo kolla_copy_cacerts
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + [[ ! -n '' ]]
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + . kolla_extend_start
Dec 11 04:46:43 np0005555140 nova_compute[187006]: Running command: 'nova-compute'
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + echo 'Running command: '\''nova-compute'\'''
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + umask 0022
Dec 11 04:46:43 np0005555140 nova_compute[187006]: + exec nova-compute
Dec 11 04:46:44 np0005555140 python3.9[187169]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 11 04:46:44 np0005555140 systemd[1]: Started libpod-conmon-98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3.scope.
Dec 11 04:46:44 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:46:44 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde85d799814b5fd1d2b4476fa6586252db54e6b2ece5d650059450e047ad638/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:44 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde85d799814b5fd1d2b4476fa6586252db54e6b2ece5d650059450e047ad638/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:44 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cde85d799814b5fd1d2b4476fa6586252db54e6b2ece5d650059450e047ad638/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 11 04:46:44 np0005555140 podman[187194]: 2025-12-11 09:46:44.460826141 +0000 UTC m=+0.134293121 container init 98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm)
Dec 11 04:46:44 np0005555140 podman[187194]: 2025-12-11 09:46:44.468508132 +0000 UTC m=+0.141975082 container start 98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init)
Dec 11 04:46:44 np0005555140 python3.9[187169]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Applying nova statedir ownership
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 11 04:46:44 np0005555140 nova_compute_init[187218]: INFO:nova_statedir:Nova statedir ownership complete
Dec 11 04:46:44 np0005555140 systemd[1]: libpod-98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3.scope: Deactivated successfully.
Dec 11 04:46:44 np0005555140 podman[187232]: 2025-12-11 09:46:44.558668144 +0000 UTC m=+0.021199210 container died 98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:46:44 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3-userdata-shm.mount: Deactivated successfully.
Dec 11 04:46:44 np0005555140 systemd[1]: var-lib-containers-storage-overlay-cde85d799814b5fd1d2b4476fa6586252db54e6b2ece5d650059450e047ad638-merged.mount: Deactivated successfully.
Dec 11 04:46:44 np0005555140 podman[187232]: 2025-12-11 09:46:44.599042935 +0000 UTC m=+0.061573991 container cleanup 98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:46:44 np0005555140 systemd[1]: libpod-conmon-98f86e8b73ea1dc01c89078485dcdee45428ab22607992e886d71dba3aec0fa3.scope: Deactivated successfully.
Dec 11 04:46:45 np0005555140 systemd[1]: session-23.scope: Deactivated successfully.
Dec 11 04:46:45 np0005555140 systemd[1]: session-23.scope: Consumed 1min 51.267s CPU time.
Dec 11 04:46:45 np0005555140 systemd-logind[787]: Session 23 logged out. Waiting for processes to exit.
Dec 11 04:46:45 np0005555140 systemd-logind[787]: Removed session 23.
Dec 11 04:46:45 np0005555140 nova_compute[187006]: 2025-12-11 09:46:45.641 187010 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 04:46:45 np0005555140 nova_compute[187006]: 2025-12-11 09:46:45.642 187010 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 04:46:45 np0005555140 nova_compute[187006]: 2025-12-11 09:46:45.642 187010 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 04:46:45 np0005555140 nova_compute[187006]: 2025-12-11 09:46:45.642 187010 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec 11 04:46:45 np0005555140 nova_compute[187006]: 2025-12-11 09:46:45.795 187010 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:46:45 np0005555140 nova_compute[187006]: 2025-12-11 09:46:45.819 187010 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:46:45 np0005555140 nova_compute[187006]: 2025-12-11 09:46:45.819 187010 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.778 187010 INFO nova.virt.driver [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.878 187010 INFO nova.compute.provider_config [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.892 187010 DEBUG oslo_concurrency.lockutils [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.892 187010 DEBUG oslo_concurrency.lockutils [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.893 187010 DEBUG oslo_concurrency.lockutils [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.893 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.893 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.893 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.893 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.894 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.895 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.895 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.895 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.895 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.895 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.895 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.895 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.896 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.896 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.896 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.896 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.896 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.896 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.897 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.897 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.897 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.897 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.897 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.897 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.898 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.898 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.898 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.898 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.898 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.898 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.898 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.899 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.899 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.899 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.899 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.899 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.899 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.900 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.900 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.900 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.900 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.900 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.900 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.901 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.901 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.901 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.901 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.901 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.901 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.901 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.902 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.903 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.903 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.903 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.903 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.903 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.904 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.904 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.904 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.904 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.904 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.905 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.905 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.905 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.905 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.905 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.905 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.906 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.906 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.906 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.906 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.906 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.906 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.906 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.907 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.907 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.907 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.907 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.907 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.907 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.907 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.908 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.909 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.909 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.909 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.909 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.909 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.909 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.909 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.910 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.910 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.910 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.910 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.910 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.910 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.910 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.911 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.911 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.911 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.911 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.911 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.911 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.911 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.912 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.912 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.912 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.912 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.912 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.912 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.912 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.913 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.914 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.914 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.914 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.914 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.914 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.914 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.914 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.915 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.915 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.915 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.915 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.915 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.915 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.916 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.916 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.916 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.916 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.916 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.916 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.916 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.917 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.917 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.917 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.917 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.917 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.917 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.917 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.918 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.918 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.918 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.918 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.918 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.918 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.918 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.919 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.919 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.919 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.919 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.919 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.919 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.920 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.920 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.920 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.920 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.920 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.920 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.921 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.921 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.921 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.921 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.921 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.921 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.922 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.922 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.922 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.922 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.922 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.922 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.923 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.923 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.923 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.923 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.923 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.924 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.924 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.924 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.924 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.924 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.924 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.924 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.925 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.925 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.925 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.925 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.925 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.925 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.925 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.926 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.926 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.926 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.926 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.926 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.926 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.926 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.927 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.927 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.927 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.927 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.927 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.927 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.928 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.928 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.928 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.928 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.928 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.928 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.928 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.929 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.929 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.929 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.929 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.929 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.929 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.930 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.930 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.930 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.930 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.930 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.930 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.931 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.931 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.931 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.931 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.931 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.931 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.932 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.932 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.932 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.932 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.932 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.933 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.933 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.933 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.933 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.933 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.933 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.933 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.934 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.934 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.934 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.934 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.934 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.934 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.934 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.935 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.935 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.935 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.935 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.935 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.935 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.935 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.936 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.936 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.936 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.936 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.936 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.937 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.937 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.937 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.937 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.937 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.937 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.938 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.938 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.938 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.938 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.938 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.938 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.939 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.939 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.939 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.939 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.939 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.939 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.939 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.940 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.940 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.940 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.940 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.940 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.940 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.941 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.941 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.941 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.941 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.941 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.942 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.942 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.942 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.942 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.942 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.942 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.943 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.943 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.943 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.943 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.943 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.943 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.944 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.944 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.944 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.944 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.944 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.945 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.945 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.945 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.945 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.945 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.946 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.946 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.946 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.946 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.946 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.947 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.947 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.947 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.947 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.947 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.947 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.948 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.948 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.948 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.948 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.948 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.949 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.949 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.949 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.949 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.950 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.950 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.950 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.950 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.950 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.951 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.951 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.951 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.951 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.951 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.952 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.952 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.952 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.952 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.952 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.952 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.952 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.953 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.953 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.953 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.953 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.953 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.953 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.954 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.954 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.954 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.954 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.954 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.954 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.955 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.955 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.955 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.955 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.955 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.955 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.956 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.956 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.956 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.956 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.956 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.957 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.957 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.957 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.957 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.957 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.957 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.958 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.958 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.958 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.958 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.958 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.958 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.959 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.959 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.959 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.959 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.959 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.959 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.959 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.960 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.961 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.961 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.961 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.961 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.961 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.961 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.962 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.962 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.962 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.962 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.963 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.964 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.964 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.964 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.964 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.964 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.965 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.965 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.965 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.965 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.965 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.966 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.966 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.966 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.966 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.966 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.966 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.966 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.967 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.967 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.967 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.967 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.967 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.967 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.968 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.968 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.968 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.968 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.969 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.969 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.969 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.969 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.969 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.969 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.970 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.970 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.970 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.970 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.970 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.970 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.971 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.971 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.971 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.971 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.971 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.971 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.971 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.972 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.972 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.972 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.972 187010 WARNING oslo_config.cfg [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 11 04:46:46 np0005555140 nova_compute[187006]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 11 04:46:46 np0005555140 nova_compute[187006]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 11 04:46:46 np0005555140 nova_compute[187006]: and ``live_migration_inbound_addr`` respectively.
Dec 11 04:46:46 np0005555140 nova_compute[187006]: ).  Its value may be silently ignored in the future.#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.973 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.973 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.973 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.973 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.973 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.973 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.974 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.974 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.974 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.974 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.974 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.974 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.975 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.975 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.975 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.975 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.975 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.975 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.976 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.976 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.976 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.976 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.976 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.976 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.977 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.977 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.977 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.977 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.977 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.977 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.978 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.978 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.978 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.978 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.978 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.979 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.979 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.979 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.979 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.979 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.979 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.980 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.980 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.980 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.980 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.980 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.980 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.980 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.981 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.981 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.981 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.981 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.981 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.981 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.981 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.982 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.982 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.982 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.982 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.982 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.983 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.983 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.983 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.983 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.983 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.984 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.984 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.984 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.984 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.984 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.984 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.985 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.985 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.985 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.985 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.985 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.986 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.986 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.986 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.986 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.986 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.986 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.987 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.987 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.987 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.987 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.987 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.988 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.988 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.988 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.988 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.988 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.988 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.988 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.989 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.989 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.989 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.989 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.989 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.990 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.990 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.990 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.990 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.990 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.990 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.990 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.991 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.991 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.991 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.991 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.991 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.991 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.991 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.992 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.992 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.992 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.992 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.992 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.992 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.992 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.993 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.993 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.993 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.993 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.993 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.993 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.993 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.994 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.994 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.994 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.994 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.994 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.994 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.994 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.995 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.995 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.995 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.995 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.995 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.995 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.996 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.996 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.996 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.996 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.996 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.996 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.997 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.997 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.997 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.997 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.997 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.997 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.997 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.998 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.998 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.998 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.998 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.998 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.998 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.999 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.999 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.999 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.999 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:46 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.999 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:46.999 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.000 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.000 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.000 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.000 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.000 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.000 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.001 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.001 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.001 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.001 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.001 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.001 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.002 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.002 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.002 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.002 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.003 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.003 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.003 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.003 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.003 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.004 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.005 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.005 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.005 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.005 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.005 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.006 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.006 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.006 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.006 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.006 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.006 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.006 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.007 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.007 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.007 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.007 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.007 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.007 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.007 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.008 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.009 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.010 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.010 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.010 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.010 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.010 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.010 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.010 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.011 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.011 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.011 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.011 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.011 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.011 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.011 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.012 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.012 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.012 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.012 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.012 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.012 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.013 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.013 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.013 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.013 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.013 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.013 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.013 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.014 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.015 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.015 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.015 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.015 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.015 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.015 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.015 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.016 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.017 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.017 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.017 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.017 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.017 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.017 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.017 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.018 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.018 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.018 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.018 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.018 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.018 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.018 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.019 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.020 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.020 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.020 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.020 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.020 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.020 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.020 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.021 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.021 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.021 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.021 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.021 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.021 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.022 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.022 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.022 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.022 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.022 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.022 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.023 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.023 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.023 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.023 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.023 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.023 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.024 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.024 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.024 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.024 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.024 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.024 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.025 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.025 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.025 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.025 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.025 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.025 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.026 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.026 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.026 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.026 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.026 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.027 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.027 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.027 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.027 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.027 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.027 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.027 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.028 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.028 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.028 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.028 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.028 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.028 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.029 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.029 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.029 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.029 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.029 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.029 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.030 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.030 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.030 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.030 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.030 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.030 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.031 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.031 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.031 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.031 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.031 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.031 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.031 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.032 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.032 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.032 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.032 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.032 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.032 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.033 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.033 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.033 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.033 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.033 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.033 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.034 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.034 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.034 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.034 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.034 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.034 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.035 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.035 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.035 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.035 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.035 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.036 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.036 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.036 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.036 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.036 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.036 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.037 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.037 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.037 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.037 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.037 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.037 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.038 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.038 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.038 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.038 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.038 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.039 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.039 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.039 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.039 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.039 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.039 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.039 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.040 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.040 187010 DEBUG oslo_service.service [None req-7b056ca2-07c3-4591-8d6d-162f698d2163 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.041 187010 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.056 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.057 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.057 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.057 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.071 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f36b35e6850> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.074 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f36b35e6850> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.075 187010 INFO nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.082 187010 INFO nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Libvirt host capabilities <capabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <host>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <uuid>8f17f30a-28b8-44d4-928b-b22923e377d2</uuid>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <arch>x86_64</arch>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model>EPYC-Rome-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <vendor>AMD</vendor>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <microcode version='16777317'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <signature family='23' model='49' stepping='0'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <maxphysaddr mode='emulate' bits='40'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='x2apic'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='tsc-deadline'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='osxsave'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='hypervisor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='tsc_adjust'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='spec-ctrl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='stibp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='arch-capabilities'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='cmp_legacy'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='topoext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='virt-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='lbrv'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='tsc-scale'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='vmcb-clean'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='pause-filter'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='pfthreshold'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='svme-addr-chk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='rdctl-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='skip-l1dfl-vmentry'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='mds-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature name='pschange-mc-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <pages unit='KiB' size='4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <pages unit='KiB' size='2048'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <pages unit='KiB' size='1048576'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <power_management>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <suspend_mem/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <suspend_disk/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <suspend_hybrid/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </power_management>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <iommu support='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <migration_features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <live/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <uri_transports>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <uri_transport>tcp</uri_transport>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <uri_transport>rdma</uri_transport>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </uri_transports>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </migration_features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <topology>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <cells num='1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <cell id='0'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          <memory unit='KiB'>7864308</memory>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          <pages unit='KiB' size='4'>1966077</pages>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          <pages unit='KiB' size='2048'>0</pages>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          <pages unit='KiB' size='1048576'>0</pages>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          <distances>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <sibling id='0' value='10'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          </distances>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          <cpus num='8'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:          </cpus>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        </cell>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </cells>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </topology>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <cache>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </cache>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <secmodel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model>selinux</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <doi>0</doi>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </secmodel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <secmodel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model>dac</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <doi>0</doi>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </secmodel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </host>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <guest>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <os_type>hvm</os_type>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <arch name='i686'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <wordsize>32</wordsize>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <domain type='qemu'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <domain type='kvm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </arch>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <pae/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <nonpae/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <acpi default='on' toggle='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <apic default='on' toggle='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <cpuselection/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <deviceboot/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <disksnapshot default='on' toggle='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <externalSnapshot/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </guest>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <guest>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <os_type>hvm</os_type>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <arch name='x86_64'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <wordsize>64</wordsize>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <domain type='qemu'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <domain type='kvm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </arch>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <acpi default='on' toggle='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <apic default='on' toggle='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <cpuselection/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <deviceboot/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <disksnapshot default='on' toggle='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <externalSnapshot/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </guest>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 
Dec 11 04:46:47 np0005555140 nova_compute[187006]: </capabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: #033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.087 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.090 187010 WARNING nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.090 187010 DEBUG nova.virt.libvirt.volume.mount [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.093 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 11 04:46:47 np0005555140 nova_compute[187006]: <domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <domain>kvm</domain>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <arch>i686</arch>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <vcpu max='4096'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <iothreads supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <os supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='firmware'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <loader supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>rom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pflash</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='readonly'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>yes</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='secure'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </loader>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='maximumMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <vendor>AMD</vendor>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='succor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='custom' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-128'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-256'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-512'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <memoryBacking supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='sourceType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>anonymous</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>memfd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </memoryBacking>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <disk supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='diskDevice'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>disk</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cdrom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>floppy</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>lun</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>fdc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>sata</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <graphics supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vnc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egl-headless</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <video supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='modelType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vga</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cirrus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>none</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>bochs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ramfb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hostdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='mode'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>subsystem</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='startupPolicy'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>mandatory</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>requisite</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>optional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='subsysType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pci</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='capsType'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='pciBackend'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hostdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <rng supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>random</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <filesystem supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='driverType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>path</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>handle</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtiofs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </filesystem>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <tpm supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-tis</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-crb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emulator</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>external</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendVersion'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>2.0</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </tpm>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <redirdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </redirdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <channel supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </channel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <crypto supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </crypto>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <interface supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>passt</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <panic supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>isa</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>hyperv</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </panic>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <console supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>null</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dev</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pipe</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stdio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>udp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tcp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu-vdagent</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <gic supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <genid supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backup supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <async-teardown supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <ps2 supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sev supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sgx supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hyperv supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='features'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>relaxed</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vapic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>spinlocks</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vpindex</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>runtime</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>synic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stimer</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reset</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vendor_id</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>frequencies</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reenlightenment</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tlbflush</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ipi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>avic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emsr_bitmap</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>xmm_input</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hyperv>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <launchSecurity supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='sectype'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tdx</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </launchSecurity>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: </domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.100 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 11 04:46:47 np0005555140 nova_compute[187006]: <domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <domain>kvm</domain>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <arch>i686</arch>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <vcpu max='240'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <iothreads supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <os supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='firmware'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <loader supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>rom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pflash</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='readonly'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>yes</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='secure'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </loader>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='maximumMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <vendor>AMD</vendor>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='succor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='custom' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-128'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-256'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-512'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <memoryBacking supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='sourceType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>anonymous</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>memfd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </memoryBacking>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <disk supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='diskDevice'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>disk</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cdrom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>floppy</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>lun</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ide</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>fdc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>sata</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <graphics supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vnc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egl-headless</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <video supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='modelType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vga</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cirrus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>none</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>bochs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ramfb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hostdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='mode'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>subsystem</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='startupPolicy'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>mandatory</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>requisite</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>optional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='subsysType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pci</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='capsType'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='pciBackend'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hostdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <rng supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>random</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <filesystem supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='driverType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>path</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>handle</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtiofs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </filesystem>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <tpm supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-tis</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-crb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emulator</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>external</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendVersion'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>2.0</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </tpm>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <redirdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </redirdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <channel supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </channel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <crypto supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </crypto>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <interface supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>passt</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <panic supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>isa</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>hyperv</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </panic>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <console supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>null</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dev</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pipe</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stdio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>udp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tcp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu-vdagent</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <gic supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <genid supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backup supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <async-teardown supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <ps2 supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sev supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sgx supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hyperv supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='features'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>relaxed</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vapic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>spinlocks</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vpindex</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>runtime</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>synic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stimer</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reset</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vendor_id</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>frequencies</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reenlightenment</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tlbflush</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ipi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>avic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emsr_bitmap</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>xmm_input</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hyperv>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <launchSecurity supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='sectype'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tdx</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </launchSecurity>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: </domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.130 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.133 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 11 04:46:47 np0005555140 nova_compute[187006]: <domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <domain>kvm</domain>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <arch>x86_64</arch>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <vcpu max='4096'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <iothreads supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <os supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='firmware'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>efi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <loader supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>rom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pflash</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='readonly'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>yes</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='secure'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>yes</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </loader>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='maximumMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <vendor>AMD</vendor>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='succor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='custom' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-128'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-256'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-512'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <memoryBacking supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='sourceType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>anonymous</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>memfd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </memoryBacking>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <disk supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='diskDevice'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>disk</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cdrom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>floppy</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>lun</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>fdc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>sata</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <graphics supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vnc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egl-headless</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <video supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='modelType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vga</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cirrus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>none</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>bochs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ramfb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hostdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='mode'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>subsystem</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='startupPolicy'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>mandatory</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>requisite</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>optional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='subsysType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pci</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='capsType'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='pciBackend'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hostdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <rng supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>random</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <filesystem supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='driverType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>path</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>handle</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtiofs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </filesystem>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <tpm supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-tis</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-crb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emulator</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>external</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendVersion'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>2.0</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </tpm>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <redirdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </redirdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <channel supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </channel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <crypto supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </crypto>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <interface supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>passt</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <panic supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>isa</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>hyperv</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </panic>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <console supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>null</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dev</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pipe</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stdio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>udp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tcp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu-vdagent</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <gic supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <genid supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backup supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <async-teardown supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <ps2 supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sev supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sgx supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hyperv supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='features'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>relaxed</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vapic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>spinlocks</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vpindex</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>runtime</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>synic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stimer</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reset</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vendor_id</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>frequencies</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reenlightenment</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tlbflush</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ipi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>avic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emsr_bitmap</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>xmm_input</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hyperv>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <launchSecurity supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='sectype'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tdx</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </launchSecurity>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: </domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.196 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 11 04:46:47 np0005555140 nova_compute[187006]: <domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <domain>kvm</domain>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <arch>x86_64</arch>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <vcpu max='240'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <iothreads supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <os supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='firmware'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <loader supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>rom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pflash</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='readonly'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>yes</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='secure'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>no</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </loader>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-passthrough' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='hostPassthroughMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='maximum' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='maximumMigratable'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>on</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>off</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='host-model' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <vendor>AMD</vendor>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='x2apic'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='hypervisor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='stibp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='overflow-recov'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='succor'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lbrv'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='tsc-scale'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='flushbyasid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pause-filter'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='pfthreshold'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <feature policy='disable' name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <mode name='custom' supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Broadwell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Cooperlake-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Denverton-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Dhyana-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='auto-ibrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Milan-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amd-psfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='no-nested-data-bp'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='null-sel-clr-base'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='stibp-always-on'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-Rome-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='EPYC-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='GraniteRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-128'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-256'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx10-512'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='prefetchiti'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Haswell-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v6'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Icelake-Server-v7'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='IvyBridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='KnightsMill-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4fmaps'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-4vnniw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512er'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512pf'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G4-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Opteron_G5-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fma4'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tbm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xop'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SapphireRapids-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='amx-tile'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-bf16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-fp16'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512-vpopcntdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bitalg'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vbmi2'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrc'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fzrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='la57'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='taa-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='tsx-ldtrk'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xfd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='SierraForest-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ifma'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-ne-convert'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx-vnni-int8'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='bus-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cmpccxadd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fbsdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='fsrs'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ibrs-all'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mcdt-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pbrsb-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='psdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='sbdr-ssdp-no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='serialize'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vaes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='vpclmulqdq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Client-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='hle'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='rtm'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Skylake-Server-v5'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512bw'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512cd'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512dq'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512f'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='avx512vl'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='invpcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pcid'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='pku'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='mpx'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v2'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v3'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='core-capability'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='split-lock-detect'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='Snowridge-v4'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='cldemote'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='erms'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='gfni'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdir64b'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='movdiri'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='xsaves'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='athlon-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='core2duo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='coreduo-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='n270-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='ss'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <blockers model='phenom-v1'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnow'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <feature name='3dnowext'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </blockers>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </mode>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <memoryBacking supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <enum name='sourceType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>anonymous</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <value>memfd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </memoryBacking>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <disk supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='diskDevice'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>disk</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cdrom</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>floppy</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>lun</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ide</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>fdc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>sata</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <graphics supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vnc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egl-headless</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <video supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='modelType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vga</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>cirrus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>none</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>bochs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ramfb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hostdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='mode'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>subsystem</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='startupPolicy'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>mandatory</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>requisite</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>optional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='subsysType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pci</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>scsi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='capsType'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='pciBackend'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hostdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <rng supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtio-non-transitional</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>random</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>egd</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <filesystem supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='driverType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>path</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>handle</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>virtiofs</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </filesystem>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <tpm supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-tis</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tpm-crb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emulator</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>external</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendVersion'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>2.0</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </tpm>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <redirdev supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='bus'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>usb</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </redirdev>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <channel supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </channel>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <crypto supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendModel'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>builtin</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </crypto>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <interface supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='backendType'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>default</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>passt</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <panic supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='model'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>isa</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>hyperv</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </panic>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <console supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='type'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>null</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vc</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pty</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dev</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>file</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>pipe</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stdio</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>udp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tcp</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>unix</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>qemu-vdagent</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>dbus</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <gic supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <vmcoreinfo supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <genid supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backingStoreInput supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <backup supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <async-teardown supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <ps2 supported='yes'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sev supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <sgx supported='no'/>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <hyperv supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='features'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>relaxed</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vapic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>spinlocks</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vpindex</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>runtime</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>synic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>stimer</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reset</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>vendor_id</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>frequencies</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>reenlightenment</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tlbflush</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>ipi</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>avic</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>emsr_bitmap</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>xmm_input</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <spinlocks>4095</spinlocks>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <stimer_direct>on</stimer_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </defaults>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </hyperv>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    <launchSecurity supported='yes'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      <enum name='sectype'>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:        <value>tdx</value>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:      </enum>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:    </launchSecurity>
Dec 11 04:46:47 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: </domainCapabilities>
Dec 11 04:46:47 np0005555140 nova_compute[187006]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.264 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.264 187010 INFO nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Secure Boot support detected#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.267 187010 INFO nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.268 187010 INFO nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.279 187010 DEBUG nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.310 187010 INFO nova.virt.node [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Determined node identity da0ef57a-f24e-4679-bba9-2f0d52d82a56 from /var/lib/nova/compute_id#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.333 187010 WARNING nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Compute nodes ['da0ef57a-f24e-4679-bba9-2f0d52d82a56'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.371 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.405 187010 WARNING nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.406 187010 DEBUG oslo_concurrency.lockutils [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.406 187010 DEBUG oslo_concurrency.lockutils [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.406 187010 DEBUG oslo_concurrency.lockutils [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.406 187010 DEBUG nova.compute.resource_tracker [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:46:47 np0005555140 systemd[1]: Starting libvirt nodedev daemon...
Dec 11 04:46:47 np0005555140 systemd[1]: Started libvirt nodedev daemon.
Dec 11 04:46:47 np0005555140 podman[187329]: 2025-12-11 09:46:47.696997272 +0000 UTC m=+0.068887332 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.729 187010 WARNING nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.730 187010 DEBUG nova.compute.resource_tracker [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6129MB free_disk=73.53607559204102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.730 187010 DEBUG oslo_concurrency.lockutils [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.731 187010 DEBUG oslo_concurrency.lockutils [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.761 187010 WARNING nova.compute.resource_tracker [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] No compute node record for compute-0.ctlplane.example.com:da0ef57a-f24e-4679-bba9-2f0d52d82a56: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host da0ef57a-f24e-4679-bba9-2f0d52d82a56 could not be found.#033[00m
Dec 11 04:46:47 np0005555140 nova_compute[187006]: 2025-12-11 09:46:47.787 187010 INFO nova.compute.resource_tracker [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: da0ef57a-f24e-4679-bba9-2f0d52d82a56#033[00m
Dec 11 04:46:48 np0005555140 nova_compute[187006]: 2025-12-11 09:46:48.269 187010 DEBUG nova.compute.resource_tracker [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:46:48 np0005555140 nova_compute[187006]: 2025-12-11 09:46:48.269 187010 DEBUG nova.compute.resource_tracker [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:46:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:46:48.610 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:46:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:46:48.610 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:46:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:46:48.610 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.197 187010 INFO nova.scheduler.client.report [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [req-3c825866-7a87-4f1e-bc4a-13dbd96e16aa] Created resource provider record via placement API for resource provider with UUID da0ef57a-f24e-4679-bba9-2f0d52d82a56 and name compute-0.ctlplane.example.com.#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.610 187010 DEBUG nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 11 04:46:49 np0005555140 nova_compute[187006]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.610 187010 INFO nova.virt.libvirt.host [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] kernel doesn't support AMD SEV#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.611 187010 DEBUG nova.compute.provider_tree [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.611 187010 DEBUG nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.916 187010 DEBUG nova.scheduler.client.report [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Updated inventory for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.916 187010 DEBUG nova.compute.provider_tree [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Updating resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec 11 04:46:49 np0005555140 nova_compute[187006]: 2025-12-11 09:46:49.916 187010 DEBUG nova.compute.provider_tree [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:46:50 np0005555140 nova_compute[187006]: 2025-12-11 09:46:50.160 187010 DEBUG nova.compute.provider_tree [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Updating resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec 11 04:46:50 np0005555140 nova_compute[187006]: 2025-12-11 09:46:50.182 187010 DEBUG nova.compute.resource_tracker [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:46:50 np0005555140 nova_compute[187006]: 2025-12-11 09:46:50.183 187010 DEBUG oslo_concurrency.lockutils [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:46:50 np0005555140 nova_compute[187006]: 2025-12-11 09:46:50.183 187010 DEBUG nova.service [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Dec 11 04:46:50 np0005555140 nova_compute[187006]: 2025-12-11 09:46:50.736 187010 DEBUG nova.service [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Dec 11 04:46:50 np0005555140 nova_compute[187006]: 2025-12-11 09:46:50.737 187010 DEBUG nova.servicegroup.drivers.db [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Dec 11 04:46:52 np0005555140 systemd-logind[787]: New session 25 of user zuul.
Dec 11 04:46:52 np0005555140 systemd[1]: Started Session 25 of User zuul.
Dec 11 04:46:54 np0005555140 python3.9[187502]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 04:46:54 np0005555140 podman[187533]: 2025-12-11 09:46:54.702054088 +0000 UTC m=+0.068102359 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 04:46:55 np0005555140 python3.9[187677]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:46:55 np0005555140 systemd[1]: Reloading.
Dec 11 04:46:55 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:46:55 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:46:56 np0005555140 python3.9[187863]: ansible-ansible.builtin.service_facts Invoked
Dec 11 04:46:56 np0005555140 network[187880]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 04:46:56 np0005555140 network[187881]: 'network-scripts' will be removed from distribution in near future.
Dec 11 04:46:56 np0005555140 network[187882]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 04:47:01 np0005555140 python3.9[188156]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:47:02 np0005555140 python3.9[188309]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:02 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:47:02 np0005555140 podman[188332]: 2025-12-11 09:47:02.713612432 +0000 UTC m=+0.083328187 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:47:03 np0005555140 python3.9[188488]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:03 np0005555140 python3.9[188640]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:47:04 np0005555140 python3.9[188792]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 04:47:05 np0005555140 python3.9[188944]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:47:05 np0005555140 systemd[1]: Reloading.
Dec 11 04:47:05 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:47:05 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:47:06 np0005555140 python3.9[189131]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:47:07 np0005555140 python3.9[189284]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:47:08 np0005555140 python3.9[189434]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:47:08 np0005555140 python3.9[189586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:09 np0005555140 python3.9[189707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446428.2649078-133-208874176428169/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:47:10 np0005555140 python3.9[189859]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 11 04:47:11 np0005555140 python3.9[190011]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 11 04:47:11 np0005555140 python3.9[190164]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 04:47:12 np0005555140 python3.9[190322]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 04:47:14 np0005555140 python3.9[190480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:14 np0005555140 python3.9[190601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765446433.614755-201-91403135652707/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:15 np0005555140 python3.9[190751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:15 np0005555140 python3.9[190872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765446434.713372-201-36322093873528/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:15 np0005555140 nova_compute[187006]: 2025-12-11 09:47:15.738 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:16 np0005555140 python3.9[191022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:16 np0005555140 nova_compute[187006]: 2025-12-11 09:47:16.455 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:16 np0005555140 python3.9[191143]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765446435.8112853-201-17066011936129/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:17 np0005555140 python3.9[191293]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:47:17 np0005555140 python3.9[191445]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:47:18 np0005555140 podman[191571]: 2025-12-11 09:47:18.438031721 +0000 UTC m=+0.065466053 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 04:47:18 np0005555140 python3.9[191607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:19 np0005555140 python3.9[191739]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446438.0942621-260-47955000734001/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:19 np0005555140 python3.9[191889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:20 np0005555140 python3.9[191965]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:20 np0005555140 python3.9[192115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:21 np0005555140 python3.9[192236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446440.2570384-260-151929268131115/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=efaf47016fe7b26b0fac1859fb210fa893f9f461 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:21 np0005555140 python3.9[192386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:22 np0005555140 python3.9[192507]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446441.3994286-260-208803505029553/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:23 np0005555140 python3.9[192657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:23 np0005555140 python3.9[192778]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446442.622302-260-230791190694973/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:24 np0005555140 python3.9[192928]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:24 np0005555140 python3.9[193049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446443.8319976-260-177191753692171/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=3820eb6e48c35431ebf53228213a5d51b7591223 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:25 np0005555140 podman[193050]: 2025-12-11 09:47:25.002629685 +0000 UTC m=+0.057541576 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Dec 11 04:47:25 np0005555140 python3.9[193220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:26 np0005555140 python3.9[193341]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446445.0819407-260-175505467959167/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:26 np0005555140 python3.9[193491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:27 np0005555140 python3.9[193612]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446446.2782001-260-228171315905101/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=33df3bf08923ad9105770f5abb51d4cde791931a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:27 np0005555140 python3.9[193762]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:28 np0005555140 python3.9[193883]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446447.428419-260-262746296283025/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:28 np0005555140 python3.9[194033]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:29 np0005555140 python3.9[194154]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446448.4896352-260-96448796662956/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=8bed8129af2c9145e8d37569bb493c0de1895d6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:30 np0005555140 python3.9[194304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:30 np0005555140 python3.9[194425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446449.6206532-260-76458460657362/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:31 np0005555140 python3.9[194575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:31 np0005555140 python3.9[194651]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:32 np0005555140 python3.9[194801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:32 np0005555140 python3.9[194877]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:32 np0005555140 podman[194878]: 2025-12-11 09:47:32.883760859 +0000 UTC m=+0.081151153 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:47:33 np0005555140 python3.9[195054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:33 np0005555140 python3.9[195130]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:34 np0005555140 python3.9[195282]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:35 np0005555140 python3.9[195434]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:35 np0005555140 python3.9[195586]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:47:36 np0005555140 python3.9[195738]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:47:36 np0005555140 systemd[1]: Reloading.
Dec 11 04:47:36 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:47:36 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:47:37 np0005555140 systemd[1]: Listening on Podman API Socket.
Dec 11 04:47:37 np0005555140 python3.9[195929]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:38 np0005555140 python3.9[196052]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446457.3711772-482-128396592876061/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:47:38 np0005555140 python3.9[196128]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:39 np0005555140 python3.9[196251]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446457.3711772-482-128396592876061/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:47:40 np0005555140 python3.9[196403]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 11 04:47:41 np0005555140 python3.9[196555]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:47:42 np0005555140 python3[196707]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:47:42 np0005555140 podman[196745]: 2025-12-11 09:47:42.747829882 +0000 UTC m=+0.054925533 container create cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Dec 11 04:47:42 np0005555140 podman[196745]: 2025-12-11 09:47:42.716450074 +0000 UTC m=+0.023545735 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf
Dec 11 04:47:42 np0005555140 python3[196707]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf kolla_start
Dec 11 04:47:43 np0005555140 python3.9[196935]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:47:44 np0005555140 python3.9[197089]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:44 np0005555140 python3.9[197240]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765446464.3562195-546-245559660509899/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.830 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.832 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.832 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.832 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.963 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.964 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.964 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.964 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.965 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.965 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.965 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.965 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:47:45 np0005555140 nova_compute[187006]: 2025-12-11 09:47:45.965 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.053 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.053 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.053 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.053 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:47:46 np0005555140 python3.9[197316]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:47:46 np0005555140 systemd[1]: Reloading.
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.225 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.226 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6094MB free_disk=73.53487396240234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.226 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.227 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:47:46 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:47:46 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.292 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.293 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.322 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.335 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.337 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:47:46 np0005555140 nova_compute[187006]: 2025-12-11 09:47:46.337 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:47:46 np0005555140 python3.9[197427]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:47:47 np0005555140 systemd[1]: Reloading.
Dec 11 04:47:47 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:47:47 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:47:47 np0005555140 systemd[1]: Starting ceilometer_agent_compute container...
Dec 11 04:47:47 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:47:47 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:47 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:47 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:47 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:47 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.
Dec 11 04:47:47 np0005555140 podman[197468]: 2025-12-11 09:47:47.672597359 +0000 UTC m=+0.292252755 container init cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + sudo -E kolla_set_configs
Dec 11 04:47:47 np0005555140 podman[197468]: 2025-12-11 09:47:47.726658407 +0000 UTC m=+0.346313803 container start cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: sudo: unable to send audit message: Operation not permitted
Dec 11 04:47:47 np0005555140 podman[197468]: ceilometer_agent_compute
Dec 11 04:47:47 np0005555140 systemd[1]: Started ceilometer_agent_compute container.
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Validating config file
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Copying service configuration files
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: INFO:__main__:Writing out command to execute
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: ++ cat /run_command
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + ARGS=
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + sudo kolla_copy_cacerts
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: sudo: unable to send audit message: Operation not permitted
Dec 11 04:47:47 np0005555140 podman[197490]: 2025-12-11 09:47:47.812592986 +0000 UTC m=+0.074101472 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:47:47 np0005555140 systemd[1]: cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-271ee403997a4d42.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + [[ ! -n '' ]]
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + . kolla_extend_start
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 11 04:47:47 np0005555140 systemd[1]: cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-271ee403997a4d42.service: Failed with result 'exit-code'.
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + umask 0022
Dec 11 04:47:47 np0005555140 ceilometer_agent_compute[197483]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 11 04:47:48 np0005555140 python3.9[197664]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:47:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:47:48.611 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:47:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:47:48.613 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:47:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:47:48.613 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:47:48 np0005555140 systemd[1]: Stopping ceilometer_agent_compute container...
Dec 11 04:47:48 np0005555140 podman[197666]: 2025-12-11 09:47:48.666561907 +0000 UTC m=+0.058359091 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 04:47:48 np0005555140 systemd[1]: libpod-cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.scope: Deactivated successfully.
Dec 11 04:47:48 np0005555140 systemd[1]: libpod-cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.scope: Consumed 1.001s CPU time.
Dec 11 04:47:48 np0005555140 podman[197674]: 2025-12-11 09:47:48.680131845 +0000 UTC m=+0.041527439 container died cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 04:47:48 np0005555140 systemd[1]: cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-271ee403997a4d42.timer: Deactivated successfully.
Dec 11 04:47:48 np0005555140 systemd[1]: Stopped /usr/bin/podman healthcheck run cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.
Dec 11 04:47:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-userdata-shm.mount: Deactivated successfully.
Dec 11 04:47:48 np0005555140 systemd[1]: var-lib-containers-storage-overlay-13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e-merged.mount: Deactivated successfully.
Dec 11 04:47:48 np0005555140 podman[197674]: 2025-12-11 09:47:48.724214657 +0000 UTC m=+0.085610251 container cleanup cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 04:47:48 np0005555140 podman[197674]: ceilometer_agent_compute
Dec 11 04:47:48 np0005555140 podman[197716]: ceilometer_agent_compute
Dec 11 04:47:48 np0005555140 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 11 04:47:48 np0005555140 systemd[1]: Stopped ceilometer_agent_compute container.
Dec 11 04:47:48 np0005555140 systemd[1]: Starting ceilometer_agent_compute container...
Dec 11 04:47:48 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:47:48 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:48 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:48 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:48 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b0a68829707c31ddaf31817ceb9c4e742c4ce96ad3ce6e95242b29d6e78e6e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:48 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.
Dec 11 04:47:48 np0005555140 podman[197729]: 2025-12-11 09:47:48.93362235 +0000 UTC m=+0.109544236 container init cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:47:48 np0005555140 ceilometer_agent_compute[197744]: + sudo -E kolla_set_configs
Dec 11 04:47:48 np0005555140 ceilometer_agent_compute[197744]: sudo: unable to send audit message: Operation not permitted
Dec 11 04:47:48 np0005555140 podman[197729]: 2025-12-11 09:47:48.961170059 +0000 UTC m=+0.137091915 container start cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:47:48 np0005555140 podman[197729]: ceilometer_agent_compute
Dec 11 04:47:48 np0005555140 systemd[1]: Started ceilometer_agent_compute container.
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Validating config file
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Copying service configuration files
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: INFO:__main__:Writing out command to execute
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: ++ cat /run_command
Dec 11 04:47:49 np0005555140 podman[197751]: 2025-12-11 09:47:49.017652315 +0000 UTC m=+0.044775602 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + ARGS=
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + sudo kolla_copy_cacerts
Dec 11 04:47:49 np0005555140 systemd[1]: cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-67286584d46ffac0.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 04:47:49 np0005555140 systemd[1]: cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-67286584d46ffac0.service: Failed with result 'exit-code'.
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: sudo: unable to send audit message: Operation not permitted
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + [[ ! -n '' ]]
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + . kolla_extend_start
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + umask 0022
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 11 04:47:49 np0005555140 python3.9[197928]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.860 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.861 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.862 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.862 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.862 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.862 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.862 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.862 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.862 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.863 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.864 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.865 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.866 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.867 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.871 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.872 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.873 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.874 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.875 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.876 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.877 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.877 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.896 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.898 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 11 04:47:49 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:49.899 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.032 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.128 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.128 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.128 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.129 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.130 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.131 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.132 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.133 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.134 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.135 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.136 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.137 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.138 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.139 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.140 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.141 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.142 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.143 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.144 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 python3.9[198054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446469.158488-578-208544639760771/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.145 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.146 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.147 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.148 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.149 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.149 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.149 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.149 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.149 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.149 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.149 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.153 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.161 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:47:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:47:50 np0005555140 python3.9[198209]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 11 04:47:51 np0005555140 python3.9[198361]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:47:52 np0005555140 python3[198513]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:47:52 np0005555140 podman[198547]: 2025-12-11 09:47:52.598695036 +0000 UTC m=+0.044006440 container create 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm)
Dec 11 04:47:52 np0005555140 podman[198547]: 2025-12-11 09:47:52.573965878 +0000 UTC m=+0.019277312 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 11 04:47:52 np0005555140 python3[198513]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume 
/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 11 04:47:53 np0005555140 python3.9[198738]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:47:54 np0005555140 python3.9[198892]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:54 np0005555140 python3.9[199043]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765446474.2484975-631-84281886320291/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:47:55 np0005555140 podman[199091]: 2025-12-11 09:47:55.165642002 +0000 UTC m=+0.062224922 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 11 04:47:55 np0005555140 python3.9[199135]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:47:55 np0005555140 systemd[1]: Reloading.
Dec 11 04:47:55 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:47:55 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:47:56 np0005555140 python3.9[199249]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:47:56 np0005555140 systemd[1]: Reloading.
Dec 11 04:47:56 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:47:56 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:47:56 np0005555140 systemd[1]: Starting node_exporter container...
Dec 11 04:47:56 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:47:57 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8da537fb3ed949e3f5460891c7f44b525967df68b926ba2a28d4e1cd4b93d5f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:57 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8da537fb3ed949e3f5460891c7f44b525967df68b926ba2a28d4e1cd4b93d5f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:57 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.
Dec 11 04:47:57 np0005555140 podman[199289]: 2025-12-11 09:47:57.182312379 +0000 UTC m=+0.351935373 container init 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.195Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.195Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.195Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.196Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.196Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.196Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.196Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.196Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=arp
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=bcache
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=bonding
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=cpu
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=edac
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=filefd
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=netclass
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=netdev
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=netstat
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=nfs
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=nvme
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=softnet
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=systemd
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=xfs
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.197Z caller=node_exporter.go:117 level=info collector=zfs
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.198Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 11 04:47:57 np0005555140 node_exporter[199305]: ts=2025-12-11T09:47:57.198Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 11 04:47:57 np0005555140 podman[199289]: 2025-12-11 09:47:57.206964335 +0000 UTC m=+0.376587309 container start 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 04:47:57 np0005555140 podman[199289]: node_exporter
Dec 11 04:47:57 np0005555140 systemd[1]: Started node_exporter container.
Dec 11 04:47:57 np0005555140 podman[199314]: 2025-12-11 09:47:57.376567389 +0000 UTC m=+0.159680341 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:47:58 np0005555140 python3.9[199491]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:47:58 np0005555140 systemd[1]: Stopping node_exporter container...
Dec 11 04:47:58 np0005555140 systemd[1]: libpod-3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.scope: Deactivated successfully.
Dec 11 04:47:58 np0005555140 podman[199495]: 2025-12-11 09:47:58.297434936 +0000 UTC m=+0.160597029 container died 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 04:47:58 np0005555140 systemd[1]: 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a-73961ae9e09e3b8e.timer: Deactivated successfully.
Dec 11 04:47:58 np0005555140 systemd[1]: Stopped /usr/bin/podman healthcheck run 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.
Dec 11 04:47:58 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a-userdata-shm.mount: Deactivated successfully.
Dec 11 04:47:58 np0005555140 systemd[1]: var-lib-containers-storage-overlay-c8da537fb3ed949e3f5460891c7f44b525967df68b926ba2a28d4e1cd4b93d5f-merged.mount: Deactivated successfully.
Dec 11 04:47:59 np0005555140 podman[199495]: 2025-12-11 09:47:59.14388837 +0000 UTC m=+1.007050463 container cleanup 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:47:59 np0005555140 podman[199495]: node_exporter
Dec 11 04:47:59 np0005555140 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 11 04:47:59 np0005555140 podman[199524]: node_exporter
Dec 11 04:47:59 np0005555140 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 11 04:47:59 np0005555140 systemd[1]: Stopped node_exporter container.
Dec 11 04:47:59 np0005555140 systemd[1]: Starting node_exporter container...
Dec 11 04:47:59 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:47:59 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8da537fb3ed949e3f5460891c7f44b525967df68b926ba2a28d4e1cd4b93d5f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:59 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8da537fb3ed949e3f5460891c7f44b525967df68b926ba2a28d4e1cd4b93d5f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:47:59 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.
Dec 11 04:48:00 np0005555140 podman[199537]: 2025-12-11 09:48:00.066431884 +0000 UTC m=+0.826407084 container init 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.080Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.080Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.080Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.080Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.080Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=arp
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=bcache
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=bonding
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=cpu
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=edac
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=filefd
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=netclass
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=netdev
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=netstat
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=nfs
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=nvme
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=softnet
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=systemd
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=xfs
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.081Z caller=node_exporter.go:117 level=info collector=zfs
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.082Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 11 04:48:00 np0005555140 node_exporter[199553]: ts=2025-12-11T09:48:00.082Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 11 04:48:00 np0005555140 podman[199537]: 2025-12-11 09:48:00.104253026 +0000 UTC m=+0.864228196 container start 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 04:48:00 np0005555140 podman[199537]: node_exporter
Dec 11 04:48:00 np0005555140 systemd[1]: Started node_exporter container.
Dec 11 04:48:00 np0005555140 podman[199562]: 2025-12-11 09:48:00.270957567 +0000 UTC m=+0.159214378 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:48:00 np0005555140 python3.9[199737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:48:01 np0005555140 python3.9[199860]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446480.4125855-663-198439022597993/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:48:02 np0005555140 python3.9[200012]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 11 04:48:03 np0005555140 podman[200136]: 2025-12-11 09:48:03.11904422 +0000 UTC m=+0.124192556 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 04:48:03 np0005555140 python3.9[200182]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:48:04 np0005555140 python3[200343]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:48:07 np0005555140 podman[200356]: 2025-12-11 09:48:07.153172538 +0000 UTC m=+2.906647940 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 11 04:48:07 np0005555140 podman[200453]: 2025-12-11 09:48:07.291298921 +0000 UTC m=+0.052867054 container create c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:48:07 np0005555140 podman[200453]: 2025-12-11 09:48:07.263748762 +0000 UTC m=+0.025316895 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 11 04:48:07 np0005555140 python3[200343]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 11 04:48:08 np0005555140 python3.9[200639]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:48:08 np0005555140 python3.9[200793]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:09 np0005555140 python3.9[200944]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765446488.8362737-716-71880257844553/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:10 np0005555140 python3.9[201020]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:48:10 np0005555140 systemd[1]: Reloading.
Dec 11 04:48:10 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:48:10 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:48:11 np0005555140 python3.9[201132]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:48:11 np0005555140 systemd[1]: Reloading.
Dec 11 04:48:11 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:48:11 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:48:11 np0005555140 systemd[1]: Starting podman_exporter container...
Dec 11 04:48:11 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:48:11 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28431659bbf2fac75a62dfeff7b60dd9e7f8d80562fd6a2de84f441df3c0e0af/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:11 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28431659bbf2fac75a62dfeff7b60dd9e7f8d80562fd6a2de84f441df3c0e0af/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:11 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.
Dec 11 04:48:11 np0005555140 podman[201172]: 2025-12-11 09:48:11.63121887 +0000 UTC m=+0.145036782 container init c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:48:11 np0005555140 podman_exporter[201187]: ts=2025-12-11T09:48:11.650Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 11 04:48:11 np0005555140 podman_exporter[201187]: ts=2025-12-11T09:48:11.650Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 11 04:48:11 np0005555140 podman_exporter[201187]: ts=2025-12-11T09:48:11.650Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 11 04:48:11 np0005555140 podman_exporter[201187]: ts=2025-12-11T09:48:11.650Z caller=handler.go:105 level=info collector=container
Dec 11 04:48:11 np0005555140 podman[201172]: 2025-12-11 09:48:11.665247594 +0000 UTC m=+0.179065466 container start c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:48:11 np0005555140 systemd[1]: Starting Podman API Service...
Dec 11 04:48:11 np0005555140 systemd[1]: Started Podman API Service.
Dec 11 04:48:11 np0005555140 podman[201172]: podman_exporter
Dec 11 04:48:11 np0005555140 systemd[1]: Started podman_exporter container.
Dec 11 04:48:11 np0005555140 podman[201198]: time="2025-12-11T09:48:11Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 11 04:48:11 np0005555140 podman[201198]: time="2025-12-11T09:48:11Z" level=info msg="Setting parallel job count to 25"
Dec 11 04:48:11 np0005555140 podman[201198]: time="2025-12-11T09:48:11Z" level=info msg="Using sqlite as database backend"
Dec 11 04:48:11 np0005555140 podman[201198]: time="2025-12-11T09:48:11Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 11 04:48:11 np0005555140 podman[201198]: time="2025-12-11T09:48:11Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 11 04:48:11 np0005555140 podman[201198]: time="2025-12-11T09:48:11Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 11 04:48:11 np0005555140 podman[201198]: @ - - [11/Dec/2025:09:48:11 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 11 04:48:11 np0005555140 podman[201198]: time="2025-12-11T09:48:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 11 04:48:11 np0005555140 podman[201196]: 2025-12-11 09:48:11.817411419 +0000 UTC m=+0.141755949 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:48:11 np0005555140 podman[201198]: @ - - [11/Dec/2025:09:48:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 20486 "" "Go-http-client/1.1"
Dec 11 04:48:11 np0005555140 podman_exporter[201187]: ts=2025-12-11T09:48:11.821Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 11 04:48:11 np0005555140 podman_exporter[201187]: ts=2025-12-11T09:48:11.822Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 11 04:48:11 np0005555140 podman_exporter[201187]: ts=2025-12-11T09:48:11.822Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 11 04:48:11 np0005555140 systemd[1]: c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e-4beefd03446b063.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 04:48:11 np0005555140 systemd[1]: c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e-4beefd03446b063.service: Failed with result 'exit-code'.
Dec 11 04:48:12 np0005555140 python3.9[201382]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:48:12 np0005555140 systemd[1]: Stopping podman_exporter container...
Dec 11 04:48:12 np0005555140 podman[201198]: @ - - [11/Dec/2025:09:48:11 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1641 "" "Go-http-client/1.1"
Dec 11 04:48:12 np0005555140 systemd[1]: libpod-c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.scope: Deactivated successfully.
Dec 11 04:48:12 np0005555140 podman[201386]: 2025-12-11 09:48:12.596296131 +0000 UTC m=+0.045713190 container died c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:48:12 np0005555140 systemd[1]: c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e-4beefd03446b063.timer: Deactivated successfully.
Dec 11 04:48:12 np0005555140 systemd[1]: Stopped /usr/bin/podman healthcheck run c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.
Dec 11 04:48:12 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e-userdata-shm.mount: Deactivated successfully.
Dec 11 04:48:12 np0005555140 systemd[1]: var-lib-containers-storage-overlay-28431659bbf2fac75a62dfeff7b60dd9e7f8d80562fd6a2de84f441df3c0e0af-merged.mount: Deactivated successfully.
Dec 11 04:48:12 np0005555140 podman[201386]: 2025-12-11 09:48:12.950791577 +0000 UTC m=+0.400208646 container cleanup c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:48:12 np0005555140 podman[201386]: podman_exporter
Dec 11 04:48:12 np0005555140 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 11 04:48:13 np0005555140 podman[201414]: podman_exporter
Dec 11 04:48:13 np0005555140 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 11 04:48:13 np0005555140 systemd[1]: Stopped podman_exporter container.
Dec 11 04:48:13 np0005555140 systemd[1]: Starting podman_exporter container...
Dec 11 04:48:13 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:48:13 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28431659bbf2fac75a62dfeff7b60dd9e7f8d80562fd6a2de84f441df3c0e0af/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:13 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28431659bbf2fac75a62dfeff7b60dd9e7f8d80562fd6a2de84f441df3c0e0af/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:13 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.
Dec 11 04:48:13 np0005555140 podman[201428]: 2025-12-11 09:48:13.126989049 +0000 UTC m=+0.099145758 container init c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:48:13 np0005555140 podman[201428]: 2025-12-11 09:48:13.152632343 +0000 UTC m=+0.124789042 container start c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:48:13 np0005555140 podman_exporter[201444]: ts=2025-12-11T09:48:13.152Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 11 04:48:13 np0005555140 podman_exporter[201444]: ts=2025-12-11T09:48:13.152Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 11 04:48:13 np0005555140 podman_exporter[201444]: ts=2025-12-11T09:48:13.152Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 11 04:48:13 np0005555140 podman_exporter[201444]: ts=2025-12-11T09:48:13.152Z caller=handler.go:105 level=info collector=container
Dec 11 04:48:13 np0005555140 podman[201198]: @ - - [11/Dec/2025:09:48:13 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 11 04:48:13 np0005555140 podman[201198]: time="2025-12-11T09:48:13Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 11 04:48:13 np0005555140 podman[201428]: podman_exporter
Dec 11 04:48:13 np0005555140 systemd[1]: Started podman_exporter container.
Dec 11 04:48:13 np0005555140 podman[201198]: @ - - [11/Dec/2025:09:48:13 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 20484 "" "Go-http-client/1.1"
Dec 11 04:48:13 np0005555140 podman_exporter[201444]: ts=2025-12-11T09:48:13.171Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 11 04:48:13 np0005555140 podman_exporter[201444]: ts=2025-12-11T09:48:13.171Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 11 04:48:13 np0005555140 podman_exporter[201444]: ts=2025-12-11T09:48:13.172Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 11 04:48:13 np0005555140 podman[201449]: 2025-12-11 09:48:13.221742071 +0000 UTC m=+0.056594751 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:48:13 np0005555140 python3.9[201633]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:48:14 np0005555140 python3.9[201756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765446493.371156-748-133739618183602/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 04:48:15 np0005555140 python3.9[201908]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 11 04:48:15 np0005555140 python3.9[202060]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 04:48:16 np0005555140 python3[202212]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 04:48:19 np0005555140 podman[202252]: 2025-12-11 09:48:19.70210701 +0000 UTC m=+0.059086482 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 04:48:19 np0005555140 podman[202251]: 2025-12-11 09:48:19.711790608 +0000 UTC m=+0.075622466 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:48:21 np0005555140 systemd[1]: cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-67286584d46ffac0.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 04:48:21 np0005555140 systemd[1]: cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29-67286584d46ffac0.service: Failed with result 'exit-code'.
Dec 11 04:48:22 np0005555140 podman[202223]: 2025-12-11 09:48:22.874261709 +0000 UTC m=+6.052030622 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 11 04:48:23 np0005555140 podman[202359]: 2025-12-11 09:48:22.995806967 +0000 UTC m=+0.026414987 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 11 04:48:23 np0005555140 podman[202359]: 2025-12-11 09:48:23.377623215 +0000 UTC m=+0.408231225 container create 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6)
Dec 11 04:48:23 np0005555140 python3[202212]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 11 04:48:24 np0005555140 auditd[702]: Audit daemon rotating log files
Dec 11 04:48:24 np0005555140 python3.9[202550]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:48:25 np0005555140 python3.9[202704]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:25 np0005555140 podman[202803]: 2025-12-11 09:48:25.686910578 +0000 UTC m=+0.054671446 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:48:25 np0005555140 python3.9[202875]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765446505.2775588-801-158361199907345/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:26 np0005555140 python3.9[202951]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 04:48:26 np0005555140 systemd[1]: Reloading.
Dec 11 04:48:26 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:48:26 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:48:27 np0005555140 python3.9[203061]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 04:48:27 np0005555140 systemd[1]: Reloading.
Dec 11 04:48:27 np0005555140 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 04:48:27 np0005555140 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 04:48:28 np0005555140 systemd[1]: Starting openstack_network_exporter container...
Dec 11 04:48:28 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:48:28 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8252f90c02e0e53e02a59b4c19f75c3ba048d58d52a623e13dc90721565fbdfd/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:28 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8252f90c02e0e53e02a59b4c19f75c3ba048d58d52a623e13dc90721565fbdfd/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:28 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8252f90c02e0e53e02a59b4c19f75c3ba048d58d52a623e13dc90721565fbdfd/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:29 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.
Dec 11 04:48:29 np0005555140 podman[203101]: 2025-12-11 09:48:29.460618071 +0000 UTC m=+0.859973523 container init 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *bridge.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *coverage.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *datapath.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *iface.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *memory.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *ovnnorthd.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *ovn.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *ovsdbserver.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *pmd_perf.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *pmd_rxq.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: INFO    09:48:29 main.go:48: registering *vswitch.Collector
Dec 11 04:48:29 np0005555140 openstack_network_exporter[203116]: NOTICE  09:48:29 main.go:76: listening on https://:9105/metrics
Dec 11 04:48:29 np0005555140 podman[203101]: 2025-12-11 09:48:29.494879921 +0000 UTC m=+0.894235353 container start 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 04:48:29 np0005555140 podman[203101]: openstack_network_exporter
Dec 11 04:48:29 np0005555140 systemd[1]: Started openstack_network_exporter container.
Dec 11 04:48:29 np0005555140 podman[203126]: 2025-12-11 09:48:29.702077752 +0000 UTC m=+0.196386172 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, version=9.6, architecture=x86_64, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 04:48:30 np0005555140 python3.9[203300]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 04:48:30 np0005555140 podman[203302]: 2025-12-11 09:48:30.580062 +0000 UTC m=+0.068896943 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 04:48:31 np0005555140 systemd[1]: Stopping openstack_network_exporter container...
Dec 11 04:48:31 np0005555140 systemd[1]: libpod-9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.scope: Deactivated successfully.
Dec 11 04:48:31 np0005555140 podman[203326]: 2025-12-11 09:48:31.618958834 +0000 UTC m=+0.069930073 container died 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 11 04:48:31 np0005555140 systemd[1]: 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d-24f91f80025e8a96.timer: Deactivated successfully.
Dec 11 04:48:31 np0005555140 systemd[1]: Stopped /usr/bin/podman healthcheck run 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.
Dec 11 04:48:31 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d-userdata-shm.mount: Deactivated successfully.
Dec 11 04:48:31 np0005555140 systemd[1]: var-lib-containers-storage-overlay-8252f90c02e0e53e02a59b4c19f75c3ba048d58d52a623e13dc90721565fbdfd-merged.mount: Deactivated successfully.
Dec 11 04:48:33 np0005555140 podman[203326]: 2025-12-11 09:48:33.42729518 +0000 UTC m=+1.878266419 container cleanup 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 11 04:48:33 np0005555140 podman[203326]: openstack_network_exporter
Dec 11 04:48:33 np0005555140 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 11 04:48:33 np0005555140 podman[203354]: openstack_network_exporter
Dec 11 04:48:33 np0005555140 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 11 04:48:33 np0005555140 systemd[1]: Stopped openstack_network_exporter container.
Dec 11 04:48:33 np0005555140 systemd[1]: Starting openstack_network_exporter container...
Dec 11 04:48:33 np0005555140 podman[203353]: 2025-12-11 09:48:33.578164044 +0000 UTC m=+0.119947321 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:48:33 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:48:33 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8252f90c02e0e53e02a59b4c19f75c3ba048d58d52a623e13dc90721565fbdfd/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:33 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8252f90c02e0e53e02a59b4c19f75c3ba048d58d52a623e13dc90721565fbdfd/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:33 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8252f90c02e0e53e02a59b4c19f75c3ba048d58d52a623e13dc90721565fbdfd/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 04:48:33 np0005555140 systemd[1]: Started /usr/bin/podman healthcheck run 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.
Dec 11 04:48:33 np0005555140 podman[203381]: 2025-12-11 09:48:33.684693566 +0000 UTC m=+0.172657006 container init 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *bridge.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *coverage.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *datapath.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *iface.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *memory.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *ovnnorthd.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *ovn.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *ovsdbserver.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *pmd_perf.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *pmd_rxq.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: INFO    09:48:33 main.go:48: registering *vswitch.Collector
Dec 11 04:48:33 np0005555140 openstack_network_exporter[203404]: NOTICE  09:48:33 main.go:76: listening on https://:9105/metrics
Dec 11 04:48:33 np0005555140 podman[203381]: 2025-12-11 09:48:33.720547473 +0000 UTC m=+0.208510893 container start 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Dec 11 04:48:33 np0005555140 podman[203381]: openstack_network_exporter
Dec 11 04:48:33 np0005555140 systemd[1]: Started openstack_network_exporter container.
Dec 11 04:48:33 np0005555140 podman[203414]: 2025-12-11 09:48:33.83071405 +0000 UTC m=+0.095304398 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Dec 11 04:48:34 np0005555140 python3.9[203584]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 04:48:35 np0005555140 python3.9[203736]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 11 04:48:36 np0005555140 python3.9[203901]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:36 np0005555140 systemd[1]: Started libpod-conmon-7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659.scope.
Dec 11 04:48:36 np0005555140 podman[203902]: 2025-12-11 09:48:36.918192725 +0000 UTC m=+0.392907487 container exec 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 11 04:48:37 np0005555140 podman[203902]: 2025-12-11 09:48:37.244262017 +0000 UTC m=+0.718976799 container exec_died 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:48:37 np0005555140 systemd[1]: libpod-conmon-7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659.scope: Deactivated successfully.
Dec 11 04:48:38 np0005555140 python3.9[204086]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:38 np0005555140 systemd[1]: Started libpod-conmon-7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659.scope.
Dec 11 04:48:38 np0005555140 podman[204087]: 2025-12-11 09:48:38.13032253 +0000 UTC m=+0.089557512 container exec 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 04:48:38 np0005555140 podman[204107]: 2025-12-11 09:48:38.238093867 +0000 UTC m=+0.096102861 container exec_died 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 04:48:38 np0005555140 podman[204087]: 2025-12-11 09:48:38.245560564 +0000 UTC m=+0.204795566 container exec_died 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 11 04:48:38 np0005555140 systemd[1]: libpod-conmon-7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659.scope: Deactivated successfully.
Dec 11 04:48:38 np0005555140 python3.9[204272]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:39 np0005555140 python3.9[204424]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 11 04:48:40 np0005555140 python3.9[204589]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:40 np0005555140 systemd[1]: Started libpod-conmon-9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483.scope.
Dec 11 04:48:40 np0005555140 podman[204590]: 2025-12-11 09:48:40.493318737 +0000 UTC m=+0.073970961 container exec 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 11 04:48:40 np0005555140 podman[204590]: 2025-12-11 09:48:40.527243338 +0000 UTC m=+0.107895552 container exec_died 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:48:40 np0005555140 systemd[1]: libpod-conmon-9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483.scope: Deactivated successfully.
Dec 11 04:48:41 np0005555140 python3.9[204774]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:41 np0005555140 systemd[1]: Started libpod-conmon-9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483.scope.
Dec 11 04:48:41 np0005555140 podman[204775]: 2025-12-11 09:48:41.572097434 +0000 UTC m=+0.317301650 container exec 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 04:48:41 np0005555140 podman[204775]: 2025-12-11 09:48:41.825365211 +0000 UTC m=+0.570569307 container exec_died 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:48:42 np0005555140 systemd[1]: libpod-conmon-9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483.scope: Deactivated successfully.
Dec 11 04:48:42 np0005555140 python3.9[204960]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:43 np0005555140 podman[205084]: 2025-12-11 09:48:43.505355641 +0000 UTC m=+0.059106241 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:48:43 np0005555140 python3.9[205134]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 11 04:48:44 np0005555140 python3.9[205299]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:44 np0005555140 systemd[1]: Started libpod-conmon-8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.scope.
Dec 11 04:48:44 np0005555140 podman[205300]: 2025-12-11 09:48:44.634230287 +0000 UTC m=+0.115897294 container exec 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 04:48:44 np0005555140 podman[205300]: 2025-12-11 09:48:44.671404442 +0000 UTC m=+0.153071429 container exec_died 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 04:48:44 np0005555140 systemd[1]: libpod-conmon-8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.scope: Deactivated successfully.
Dec 11 04:48:45 np0005555140 python3.9[205483]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:45 np0005555140 systemd[1]: Started libpod-conmon-8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.scope.
Dec 11 04:48:45 np0005555140 podman[205484]: 2025-12-11 09:48:45.467925714 +0000 UTC m=+0.085098573 container exec 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 11 04:48:45 np0005555140 podman[205484]: 2025-12-11 09:48:45.501074483 +0000 UTC m=+0.118247322 container exec_died 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 11 04:48:45 np0005555140 systemd[1]: libpod-conmon-8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248.scope: Deactivated successfully.
Dec 11 04:48:46 np0005555140 python3.9[205664]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:46 np0005555140 nova_compute[187006]: 2025-12-11 09:48:46.330 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:46 np0005555140 nova_compute[187006]: 2025-12-11 09:48:46.372 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:46 np0005555140 nova_compute[187006]: 2025-12-11 09:48:46.373 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:46 np0005555140 nova_compute[187006]: 2025-12-11 09:48:46.373 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:46 np0005555140 nova_compute[187006]: 2025-12-11 09:48:46.373 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:48:46 np0005555140 nova_compute[187006]: 2025-12-11 09:48:46.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:47 np0005555140 python3.9[205816]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 11 04:48:47 np0005555140 python3.9[205981]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.827 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.845 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.845 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.845 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.846 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.873 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.873 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.873 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:48:47 np0005555140 nova_compute[187006]: 2025-12-11 09:48:47.873 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:48:47 np0005555140 systemd[1]: Started libpod-conmon-cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.scope.
Dec 11 04:48:47 np0005555140 podman[205982]: 2025-12-11 09:48:47.892677028 +0000 UTC m=+0.077087541 container exec cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3)
Dec 11 04:48:47 np0005555140 podman[205982]: 2025-12-11 09:48:47.922585503 +0000 UTC m=+0.106995976 container exec_died cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:48:47 np0005555140 systemd[1]: libpod-conmon-cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.scope: Deactivated successfully.
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.062 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.063 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5822MB free_disk=73.36714935302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.063 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.064 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.136 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.137 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.159 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.174 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.176 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:48:48 np0005555140 nova_compute[187006]: 2025-12-11 09:48:48.177 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:48:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:48:48.613 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:48:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:48:48.614 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:48:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:48:48.614 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:48:48 np0005555140 python3.9[206162]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:48 np0005555140 systemd[1]: Started libpod-conmon-cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.scope.
Dec 11 04:48:48 np0005555140 podman[206163]: 2025-12-11 09:48:48.769260846 +0000 UTC m=+0.075543616 container exec cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 11 04:48:48 np0005555140 podman[206163]: 2025-12-11 09:48:48.80224264 +0000 UTC m=+0.108525400 container exec_died cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 04:48:48 np0005555140 systemd[1]: libpod-conmon-cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29.scope: Deactivated successfully.
Dec 11 04:48:49 np0005555140 python3.9[206346]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:50 np0005555140 python3.9[206498]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 11 04:48:51 np0005555140 python3.9[206663]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:51 np0005555140 systemd[1]: Started libpod-conmon-3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.scope.
Dec 11 04:48:51 np0005555140 podman[206664]: 2025-12-11 09:48:51.207454059 +0000 UTC m=+0.081972073 container exec 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:48:51 np0005555140 podman[206683]: 2025-12-11 09:48:51.273150379 +0000 UTC m=+0.052927902 container exec_died 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 04:48:51 np0005555140 podman[206664]: 2025-12-11 09:48:51.281102289 +0000 UTC m=+0.155620323 container exec_died 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 04:48:51 np0005555140 systemd[1]: libpod-conmon-3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.scope: Deactivated successfully.
Dec 11 04:48:51 np0005555140 podman[206796]: 2025-12-11 09:48:51.711913852 +0000 UTC m=+0.077356399 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS)
Dec 11 04:48:51 np0005555140 podman[206792]: 2025-12-11 09:48:51.719174242 +0000 UTC m=+0.086392130 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 04:48:52 np0005555140 python3.9[206881]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:52 np0005555140 systemd[1]: Started libpod-conmon-3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.scope.
Dec 11 04:48:52 np0005555140 podman[206882]: 2025-12-11 09:48:52.183742201 +0000 UTC m=+0.091475857 container exec 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 04:48:52 np0005555140 podman[206882]: 2025-12-11 09:48:52.22035223 +0000 UTC m=+0.128085856 container exec_died 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:48:52 np0005555140 systemd[1]: libpod-conmon-3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a.scope: Deactivated successfully.
Dec 11 04:48:52 np0005555140 python3.9[207064]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:53 np0005555140 python3.9[207216]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 11 04:48:54 np0005555140 python3.9[207381]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:54 np0005555140 systemd[1]: Started libpod-conmon-c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.scope.
Dec 11 04:48:54 np0005555140 podman[207382]: 2025-12-11 09:48:54.514523367 +0000 UTC m=+0.083408494 container exec c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:48:54 np0005555140 podman[207382]: 2025-12-11 09:48:54.550514498 +0000 UTC m=+0.119399625 container exec_died c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:48:54 np0005555140 systemd[1]: libpod-conmon-c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.scope: Deactivated successfully.
Dec 11 04:48:55 np0005555140 python3.9[207565]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:55 np0005555140 systemd[1]: Started libpod-conmon-c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.scope.
Dec 11 04:48:55 np0005555140 podman[207566]: 2025-12-11 09:48:55.303067997 +0000 UTC m=+0.072392165 container exec c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 04:48:55 np0005555140 podman[207566]: 2025-12-11 09:48:55.344329951 +0000 UTC m=+0.113654089 container exec_died c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:48:55 np0005555140 systemd[1]: libpod-conmon-c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e.scope: Deactivated successfully.
Dec 11 04:48:55 np0005555140 podman[207719]: 2025-12-11 09:48:55.856084384 +0000 UTC m=+0.059040889 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 04:48:56 np0005555140 python3.9[207766]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:56 np0005555140 python3.9[207918]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 11 04:48:57 np0005555140 python3.9[208083]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:57 np0005555140 systemd[1]: Started libpod-conmon-9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.scope.
Dec 11 04:48:57 np0005555140 podman[208084]: 2025-12-11 09:48:57.583119213 +0000 UTC m=+0.087941015 container exec 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec 11 04:48:57 np0005555140 podman[208084]: 2025-12-11 09:48:57.614766138 +0000 UTC m=+0.119587920 container exec_died 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git)
Dec 11 04:48:57 np0005555140 systemd[1]: libpod-conmon-9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.scope: Deactivated successfully.
Dec 11 04:48:58 np0005555140 python3.9[208266]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 04:48:58 np0005555140 systemd[1]: Started libpod-conmon-9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.scope.
Dec 11 04:48:58 np0005555140 podman[208267]: 2025-12-11 09:48:58.450978648 +0000 UTC m=+0.089749488 container exec 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 11 04:48:58 np0005555140 podman[208267]: 2025-12-11 09:48:58.488429411 +0000 UTC m=+0.127200261 container exec_died 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Dec 11 04:48:58 np0005555140 systemd[1]: libpod-conmon-9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d.scope: Deactivated successfully.
Dec 11 04:48:59 np0005555140 python3.9[208450]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:48:59 np0005555140 python3.9[208602]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:00 np0005555140 python3.9[208754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:00 np0005555140 podman[208779]: 2025-12-11 09:49:00.687293811 +0000 UTC m=+0.054530099 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 04:49:01 np0005555140 python3.9[208901]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765446540.044746-1082-33658528558535/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:01 np0005555140 python3.9[209053]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:02 np0005555140 python3.9[209207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:02 np0005555140 python3.9[209285]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:03 np0005555140 python3.9[209437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:03 np0005555140 podman[209440]: 2025-12-11 09:49:03.744971664 +0000 UTC m=+0.113861075 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Dec 11 04:49:03 np0005555140 podman[209541]: 2025-12-11 09:49:03.972049383 +0000 UTC m=+0.080900801 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Dec 11 04:49:04 np0005555140 python3.9[209542]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zod70wel recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:04 np0005555140 python3.9[209714]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:05 np0005555140 python3.9[209792]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:05 np0005555140 python3.9[209944]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:49:06 np0005555140 python3[210097]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 04:49:07 np0005555140 python3.9[210249]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:07 np0005555140 python3.9[210327]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:08 np0005555140 python3.9[210479]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:09 np0005555140 python3.9[210557]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:10 np0005555140 python3.9[210709]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:10 np0005555140 python3.9[210787]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:11 np0005555140 python3.9[210939]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:11 np0005555140 python3.9[211017]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:12 np0005555140 python3.9[211169]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 04:49:13 np0005555140 python3.9[211294]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765446551.903224-1207-52350035903103/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:13 np0005555140 podman[211446]: 2025-12-11 09:49:13.604193583 +0000 UTC m=+0.049912754 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 04:49:13 np0005555140 python3.9[211447]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:14 np0005555140 python3.9[211624]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:49:15 np0005555140 python3.9[211779]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:15 np0005555140 python3.9[211931]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:49:16 np0005555140 python3.9[212084]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 04:49:17 np0005555140 python3.9[212238]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 04:49:18 np0005555140 python3.9[212393]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 04:49:18 np0005555140 systemd[1]: session-25.scope: Deactivated successfully.
Dec 11 04:49:18 np0005555140 systemd[1]: session-25.scope: Consumed 1min 39.213s CPU time.
Dec 11 04:49:18 np0005555140 systemd-logind[787]: Session 25 logged out. Waiting for processes to exit.
Dec 11 04:49:18 np0005555140 systemd-logind[787]: Removed session 25.
Dec 11 04:49:21 np0005555140 podman[212421]: 2025-12-11 09:49:21.859896567 +0000 UTC m=+0.090073087 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:49:21 np0005555140 podman[212420]: 2025-12-11 09:49:21.86106495 +0000 UTC m=+0.092778465 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 11 04:49:26 np0005555140 podman[212457]: 2025-12-11 09:49:26.715711656 +0000 UTC m=+0.093551277 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 04:49:31 np0005555140 podman[212478]: 2025-12-11 09:49:31.688286444 +0000 UTC m=+0.053090787 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:49:34 np0005555140 podman[212503]: 2025-12-11 09:49:34.689773792 +0000 UTC m=+0.062191920 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, distribution-scope=public)
Dec 11 04:49:34 np0005555140 podman[212502]: 2025-12-11 09:49:34.724240049 +0000 UTC m=+0.096507053 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 11 04:49:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:49:43.796 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:49:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:49:43.797 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:49:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:49:43.798 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:49:44 np0005555140 podman[212551]: 2025-12-11 09:49:44.680785494 +0000 UTC m=+0.051179150 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:49:46 np0005555140 nova_compute[187006]: 2025-12-11 09:49:46.159 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:46 np0005555140 nova_compute[187006]: 2025-12-11 09:49:46.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:46 np0005555140 nova_compute[187006]: 2025-12-11 09:49:46.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:46 np0005555140 nova_compute[187006]: 2025-12-11 09:49:46.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:49:47 np0005555140 nova_compute[187006]: 2025-12-11 09:49:47.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:47 np0005555140 nova_compute[187006]: 2025-12-11 09:49:47.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:49:47 np0005555140 nova_compute[187006]: 2025-12-11 09:49:47.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:49:47 np0005555140 nova_compute[187006]: 2025-12-11 09:49:47.843 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:49:47 np0005555140 nova_compute[187006]: 2025-12-11 09:49:47.843 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:49:48.614 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:49:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:49:48.615 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:49:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:49:48.615 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:49:48 np0005555140 nova_compute[187006]: 2025-12-11 09:49:48.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:48 np0005555140 nova_compute[187006]: 2025-12-11 09:49:48.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:48 np0005555140 nova_compute[187006]: 2025-12-11 09:49:48.856 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:49:48 np0005555140 nova_compute[187006]: 2025-12-11 09:49:48.856 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:49:48 np0005555140 nova_compute[187006]: 2025-12-11 09:49:48.857 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:49:48 np0005555140 nova_compute[187006]: 2025-12-11 09:49:48.857 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.018 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.019 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5936MB free_disk=73.3669319152832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.019 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.019 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.201 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.201 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.220 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.234 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.236 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:49:49 np0005555140 nova_compute[187006]: 2025-12-11 09:49:49.236 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.162 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:49:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:49:50 np0005555140 nova_compute[187006]: 2025-12-11 09:49:50.232 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:50 np0005555140 nova_compute[187006]: 2025-12-11 09:49:50.233 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:49:52 np0005555140 podman[212576]: 2025-12-11 09:49:52.689414002 +0000 UTC m=+0.053283881 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 04:49:52 np0005555140 podman[212575]: 2025-12-11 09:49:52.690670448 +0000 UTC m=+0.058207673 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 04:49:57 np0005555140 podman[212614]: 2025-12-11 09:49:57.671814954 +0000 UTC m=+0.052211760 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 11 04:50:02 np0005555140 podman[212633]: 2025-12-11 09:50:02.687673621 +0000 UTC m=+0.053172608 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:50:05 np0005555140 podman[212658]: 2025-12-11 09:50:05.688841973 +0000 UTC m=+0.058325046 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350)
Dec 11 04:50:05 np0005555140 podman[212657]: 2025-12-11 09:50:05.696706141 +0000 UTC m=+0.075639047 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Dec 11 04:50:15 np0005555140 podman[212702]: 2025-12-11 09:50:15.689133683 +0000 UTC m=+0.056458932 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:50:23 np0005555140 podman[212727]: 2025-12-11 09:50:23.679345949 +0000 UTC m=+0.053293621 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 04:50:23 np0005555140 podman[212726]: 2025-12-11 09:50:23.711044295 +0000 UTC m=+0.085280225 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 11 04:50:28 np0005555140 podman[212765]: 2025-12-11 09:50:28.673118958 +0000 UTC m=+0.050687205 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 04:50:33 np0005555140 podman[212784]: 2025-12-11 09:50:33.680576044 +0000 UTC m=+0.052242400 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 04:50:36 np0005555140 podman[212811]: 2025-12-11 09:50:36.723849892 +0000 UTC m=+0.088448527 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Dec 11 04:50:36 np0005555140 podman[212810]: 2025-12-11 09:50:36.724027587 +0000 UTC m=+0.090019332 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:50:46 np0005555140 podman[212855]: 2025-12-11 09:50:46.674948828 +0000 UTC m=+0.045155090 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:50:47 np0005555140 nova_compute[187006]: 2025-12-11 09:50:47.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:47 np0005555140 nova_compute[187006]: 2025-12-11 09:50:47.845 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:50:48.615 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:50:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:50:48.615 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:50:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:50:48.615 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.852 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.853 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.853 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.854 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.854 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:48 np0005555140 nova_compute[187006]: 2025-12-11 09:50:48.854 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:50:49 np0005555140 nova_compute[187006]: 2025-12-11 09:50:49.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:49 np0005555140 nova_compute[187006]: 2025-12-11 09:50:49.861 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:50:49 np0005555140 nova_compute[187006]: 2025-12-11 09:50:49.861 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:50:49 np0005555140 nova_compute[187006]: 2025-12-11 09:50:49.861 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:50:49 np0005555140 nova_compute[187006]: 2025-12-11 09:50:49.862 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.008 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.009 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6026MB free_disk=73.3669319152832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.009 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.009 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.069 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.070 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.095 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.108 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.109 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:50:50 np0005555140 nova_compute[187006]: 2025-12-11 09:50:50.109 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:50:51 np0005555140 nova_compute[187006]: 2025-12-11 09:50:51.104 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:51 np0005555140 nova_compute[187006]: 2025-12-11 09:50:51.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:50:54 np0005555140 podman[212876]: 2025-12-11 09:50:54.682734714 +0000 UTC m=+0.055148348 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 04:50:54 np0005555140 podman[212877]: 2025-12-11 09:50:54.694884413 +0000 UTC m=+0.062259822 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 04:50:59 np0005555140 podman[212916]: 2025-12-11 09:50:59.712738025 +0000 UTC m=+0.072177278 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:51:04 np0005555140 podman[212935]: 2025-12-11 09:51:04.706422202 +0000 UTC m=+0.077862851 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:51:07 np0005555140 podman[212962]: 2025-12-11 09:51:07.69166712 +0000 UTC m=+0.054185270 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 11 04:51:07 np0005555140 podman[212961]: 2025-12-11 09:51:07.737156809 +0000 UTC m=+0.108601585 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202)
Dec 11 04:51:17 np0005555140 podman[213008]: 2025-12-11 09:51:17.682966737 +0000 UTC m=+0.059556005 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:51:25 np0005555140 podman[213033]: 2025-12-11 09:51:25.694663415 +0000 UTC m=+0.067258595 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 11 04:51:25 np0005555140 podman[213034]: 2025-12-11 09:51:25.699810323 +0000 UTC m=+0.065980059 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 04:51:30 np0005555140 podman[213074]: 2025-12-11 09:51:30.688766605 +0000 UTC m=+0.058626928 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec 11 04:51:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:51:30.761 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:51:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:51:30.763 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:51:34 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:51:34.764 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:51:35 np0005555140 podman[213097]: 2025-12-11 09:51:35.680246379 +0000 UTC m=+0.058458013 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 04:51:38 np0005555140 podman[213124]: 2025-12-11 09:51:38.696165792 +0000 UTC m=+0.065265899 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Dec 11 04:51:38 np0005555140 podman[213123]: 2025-12-11 09:51:38.766229878 +0000 UTC m=+0.134587693 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 11 04:51:45 np0005555140 nova_compute[187006]: 2025-12-11 09:51:45.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:45 np0005555140 nova_compute[187006]: 2025-12-11 09:51:45.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 04:51:45 np0005555140 nova_compute[187006]: 2025-12-11 09:51:45.859 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 04:51:45 np0005555140 nova_compute[187006]: 2025-12-11 09:51:45.860 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:45 np0005555140 nova_compute[187006]: 2025-12-11 09:51:45.860 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 04:51:45 np0005555140 nova_compute[187006]: 2025-12-11 09:51:45.877 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:51:48.615 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:51:48.616 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:51:48.616 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:48 np0005555140 podman[213169]: 2025-12-11 09:51:48.678062751 +0000 UTC m=+0.053327526 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.929 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.929 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.929 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.952 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.953 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.953 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.953 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:48 np0005555140 nova_compute[187006]: 2025-12-11 09:51:48.953 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:51:49 np0005555140 nova_compute[187006]: 2025-12-11 09:51:49.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:49 np0005555140 nova_compute[187006]: 2025-12-11 09:51:49.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.162 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.162 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:51:50.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 04:51:50 np0005555140 nova_compute[187006]: 2025-12-11 09:51:50.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.563 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.563 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.563 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.563 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.724 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.725 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6062MB free_disk=73.36695098876953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.726 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.726 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.855 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.855 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.908 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing inventories for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.965 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Updating ProviderTree inventory for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.965 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:51:51 np0005555140 nova_compute[187006]: 2025-12-11 09:51:51.979 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing aggregate associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 04:51:52 np0005555140 nova_compute[187006]: 2025-12-11 09:51:52.001 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing trait associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,HW_CPU_X86_SVM,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 04:51:52 np0005555140 nova_compute[187006]: 2025-12-11 09:51:52.021 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:51:52 np0005555140 nova_compute[187006]: 2025-12-11 09:51:52.046 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:51:52 np0005555140 nova_compute[187006]: 2025-12-11 09:51:52.048 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:51:52 np0005555140 nova_compute[187006]: 2025-12-11 09:51:52.048 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.044 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.045 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.277 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.277 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.310 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.409 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.409 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.417 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.418 187010 INFO nova.compute.claims [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.529 187010 DEBUG nova.compute.provider_tree [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.545 187010 DEBUG nova.scheduler.client.report [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.570 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.571 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.620 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.621 187010 DEBUG nova.network.neutron [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.646 187010 INFO nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.663 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.845 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.846 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.846 187010 INFO nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Creating image(s)#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.847 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.847 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.848 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.849 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:53 np0005555140 nova_compute[187006]: 2025-12-11 09:51:53.849 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:54 np0005555140 nova_compute[187006]: 2025-12-11 09:51:54.576 187010 WARNING oslo_policy.policy [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec 11 04:51:54 np0005555140 nova_compute[187006]: 2025-12-11 09:51:54.577 187010 WARNING oslo_policy.policy [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec 11 04:51:54 np0005555140 nova_compute[187006]: 2025-12-11 09:51:54.579 187010 DEBUG nova.policy [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:51:55 np0005555140 nova_compute[187006]: 2025-12-11 09:51:55.714 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:55 np0005555140 nova_compute[187006]: 2025-12-11 09:51:55.804 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.part --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:55 np0005555140 nova_compute[187006]: 2025-12-11 09:51:55.805 187010 DEBUG nova.virt.images [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] 9e66a2ab-a034-4869-91a9-a90f37915272 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec 11 04:51:55 np0005555140 nova_compute[187006]: 2025-12-11 09:51:55.806 187010 DEBUG nova.privsep.utils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec 11 04:51:55 np0005555140 nova_compute[187006]: 2025-12-11 09:51:55.806 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.part /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.065 187010 DEBUG nova.network.neutron [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Successfully created port: 98230026-4a3b-410e-9e9e-9a7b06879680 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.073 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.part /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.converted" returned: 0 in 0.266s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.077 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.144 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b.converted --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.146 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.171 187010 INFO oslo.privsep.daemon [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp5pto2nm4/privsep.sock']#033[00m
Dec 11 04:51:56 np0005555140 podman[213212]: 2025-12-11 09:51:56.68343121 +0000 UTC m=+0.055503370 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:51:56 np0005555140 podman[213213]: 2025-12-11 09:51:56.684256333 +0000 UTC m=+0.056262061 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.893 187010 INFO oslo.privsep.daemon [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.745 213253 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.749 213253 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.751 213253 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.751 213253 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213253#033[00m
Dec 11 04:51:56 np0005555140 nova_compute[187006]: 2025-12-11 09:51:56.984 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.068 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.069 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.070 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.081 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.134 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.135 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.264 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk 1073741824" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.265 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.266 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.358 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.359 187010 DEBUG nova.virt.disk.api [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.360 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.433 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.434 187010 DEBUG nova.virt.disk.api [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.434 187010 DEBUG nova.objects.instance [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 268e9416-7794-482e-8290-95afc792d4c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.458 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.459 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Ensure instance console log exists: /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.459 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.460 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.460 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.733 187010 DEBUG nova.network.neutron [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Successfully updated port: 98230026-4a3b-410e-9e9e-9a7b06879680 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.751 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.751 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:51:57 np0005555140 nova_compute[187006]: 2025-12-11 09:51:57.752 187010 DEBUG nova.network.neutron [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.245 187010 DEBUG nova.compute.manager [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-changed-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.245 187010 DEBUG nova.compute.manager [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Refreshing instance network info cache due to event network-changed-98230026-4a3b-410e-9e9e-9a7b06879680. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.246 187010 DEBUG oslo_concurrency.lockutils [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.248 187010 DEBUG nova.network.neutron [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.889 187010 DEBUG nova.network.neutron [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updating instance_info_cache with network_info: [{"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.912 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.912 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Instance network_info: |[{"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.912 187010 DEBUG oslo_concurrency.lockutils [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.913 187010 DEBUG nova.network.neutron [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Refreshing network info cache for port 98230026-4a3b-410e-9e9e-9a7b06879680 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.916 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Start _get_guest_xml network_info=[{"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.922 187010 WARNING nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.932 187010 DEBUG nova.virt.libvirt.host [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.933 187010 DEBUG nova.virt.libvirt.host [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.937 187010 DEBUG nova.virt.libvirt.host [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.938 187010 DEBUG nova.virt.libvirt.host [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.939 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.940 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.940 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.941 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.941 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.942 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.942 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.942 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.943 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.943 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.944 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.944 187010 DEBUG nova.virt.hardware [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.950 187010 DEBUG nova.privsep.utils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.952 187010 DEBUG nova.virt.libvirt.vif [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1217004400',display_name='tempest-TestNetworkBasicOps-server-1217004400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1217004400',id=1,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGJ6bz+VXXSH81a0VHz7rgBFE29m5WU0Y/vAsmHz3b31tXEehN7HVhY89KfeMsMIWFiwKd0Dv1bEIWabt+cel+h24NvALwfU+elUZgsFh6eWOkAB0Ht7CHhvmXSV8HY79A==',key_name='tempest-TestNetworkBasicOps-1693520131',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-rhc7zd77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:51:53Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=268e9416-7794-482e-8290-95afc792d4c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.953 187010 DEBUG nova.network.os_vif_util [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.954 187010 DEBUG nova.network.os_vif_util [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:c8:c2,bridge_name='br-int',has_traffic_filtering=True,id=98230026-4a3b-410e-9e9e-9a7b06879680,network=Network(0042166c-0227-49c9-94ae-99b2fe7aec0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98230026-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.956 187010 DEBUG nova.objects.instance [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 268e9416-7794-482e-8290-95afc792d4c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.975 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <uuid>268e9416-7794-482e-8290-95afc792d4c1</uuid>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <name>instance-00000001</name>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1217004400</nova:name>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:51:58</nova:creationTime>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        <nova:port uuid="98230026-4a3b-410e-9e9e-9a7b06879680">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <entry name="serial">268e9416-7794-482e-8290-95afc792d4c1</entry>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <entry name="uuid">268e9416-7794-482e-8290-95afc792d4c1</entry>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk.config"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:a8:c8:c2"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <target dev="tap98230026-4a"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/console.log" append="off"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:51:58 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:51:58 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:51:58 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:51:58 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.976 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Preparing to wait for external event network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.976 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.976 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.976 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.977 187010 DEBUG nova.virt.libvirt.vif [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1217004400',display_name='tempest-TestNetworkBasicOps-server-1217004400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1217004400',id=1,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGJ6bz+VXXSH81a0VHz7rgBFE29m5WU0Y/vAsmHz3b31tXEehN7HVhY89KfeMsMIWFiwKd0Dv1bEIWabt+cel+h24NvALwfU+elUZgsFh6eWOkAB0Ht7CHhvmXSV8HY79A==',key_name='tempest-TestNetworkBasicOps-1693520131',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-rhc7zd77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:51:53Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=268e9416-7794-482e-8290-95afc792d4c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.977 187010 DEBUG nova.network.os_vif_util [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.978 187010 DEBUG nova.network.os_vif_util [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:c8:c2,bridge_name='br-int',has_traffic_filtering=True,id=98230026-4a3b-410e-9e9e-9a7b06879680,network=Network(0042166c-0227-49c9-94ae-99b2fe7aec0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98230026-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:51:58 np0005555140 nova_compute[187006]: 2025-12-11 09:51:58.978 187010 DEBUG os_vif [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:c8:c2,bridge_name='br-int',has_traffic_filtering=True,id=98230026-4a3b-410e-9e9e-9a7b06879680,network=Network(0042166c-0227-49c9-94ae-99b2fe7aec0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98230026-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.009 187010 DEBUG ovsdbapp.backend.ovs_idl [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.010 187010 DEBUG ovsdbapp.backend.ovs_idl [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.010 187010 DEBUG ovsdbapp.backend.ovs_idl [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.010 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.011 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.011 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.012 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.013 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.015 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.025 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.025 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.026 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.027 187010 INFO oslo.privsep.daemon [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp49f9to3p/privsep.sock']#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.770 187010 INFO oslo.privsep.daemon [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.616 213274 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.619 213274 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.621 213274 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Dec 11 04:51:59 np0005555140 nova_compute[187006]: 2025-12-11 09:51:59.621 213274 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213274#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.069 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.070 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap98230026-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.070 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap98230026-4a, col_values=(('external_ids', {'iface-id': '98230026-4a3b-410e-9e9e-9a7b06879680', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:c8:c2', 'vm-uuid': '268e9416-7794-482e-8290-95afc792d4c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.073 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:00 np0005555140 NetworkManager[55531]: <info>  [1765446720.0737] manager: (tap98230026-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.076 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.080 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.081 187010 INFO os_vif [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:c8:c2,bridge_name='br-int',has_traffic_filtering=True,id=98230026-4a3b-410e-9e9e-9a7b06879680,network=Network(0042166c-0227-49c9-94ae-99b2fe7aec0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98230026-4a')#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.130 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.131 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.131 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:a8:c8:c2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:52:00 np0005555140 nova_compute[187006]: 2025-12-11 09:52:00.132 187010 INFO nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Using config drive#033[00m
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.647 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.662 187010 DEBUG nova.network.neutron [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updated VIF entry in instance network info cache for port 98230026-4a3b-410e-9e9e-9a7b06879680. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.663 187010 DEBUG nova.network.neutron [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updating instance_info_cache with network_info: [{"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:52:01 np0005555140 podman[213280]: 2025-12-11 09:52:01.681641775 +0000 UTC m=+0.052511354 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS 
Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.681 187010 INFO nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Creating config drive at /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk.config#033[00m
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.687 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbc29oz2p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.704 187010 DEBUG oslo_concurrency.lockutils [req-a2e869b8-19d7-4c6e-83de-8e8d590cf5a8 req-0186ef37-6680-4c7d-84f4-7ea0e1131c8e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.812 187010 DEBUG oslo_concurrency.processutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbc29oz2p" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:52:01 np0005555140 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 11 04:52:01 np0005555140 kernel: tap98230026-4a: entered promiscuous mode
Dec 11 04:52:01 np0005555140 NetworkManager[55531]: <info>  [1765446721.9350] manager: (tap98230026-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Dec 11 04:52:01 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:01Z|00027|binding|INFO|Claiming lport 98230026-4a3b-410e-9e9e-9a7b06879680 for this chassis.
Dec 11 04:52:01 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:01Z|00028|binding|INFO|98230026-4a3b-410e-9e9e-9a7b06879680: Claiming fa:16:3e:a8:c8:c2 10.100.0.7
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.939 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:01 np0005555140 nova_compute[187006]: 2025-12-11 09:52:01.943 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:01 np0005555140 systemd-udevd[213320]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:52:02 np0005555140 NetworkManager[55531]: <info>  [1765446722.0088] device (tap98230026-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:52:02 np0005555140 NetworkManager[55531]: <info>  [1765446722.0100] device (tap98230026-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:52:02 np0005555140 nova_compute[187006]: 2025-12-11 09:52:02.026 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:02 np0005555140 systemd-machined[153398]: New machine qemu-1-instance-00000001.
Dec 11 04:52:02 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:02Z|00029|binding|INFO|Setting lport 98230026-4a3b-410e-9e9e-9a7b06879680 ovn-installed in OVS
Dec 11 04:52:02 np0005555140 nova_compute[187006]: 2025-12-11 09:52:02.032 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:02 np0005555140 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 11 04:52:02 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:02Z|00030|binding|INFO|Setting lport 98230026-4a3b-410e-9e9e-9a7b06879680 up in Southbound
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.091 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:c8:c2 10.100.0.7'], port_security=['fa:16:3e:a8:c8:c2 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '268e9416-7794-482e-8290-95afc792d4c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd55fe206-2b00-4f4d-9918-4d5d81a7b847', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43167a1-3764-4b58-b022-3678644ed4a5, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=98230026-4a3b-410e-9e9e-9a7b06879680) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.092 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 98230026-4a3b-410e-9e9e-9a7b06879680 in datapath 0042166c-0227-49c9-94ae-99b2fe7aec0a bound to our chassis#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.094 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0042166c-0227-49c9-94ae-99b2fe7aec0a#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.095 104288 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpzzukwpwn/privsep.sock']#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.776 104288 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.777 104288 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpzzukwpwn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.644 213337 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.651 213337 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.655 213337 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.655 213337 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213337#033[00m
Dec 11 04:52:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:02.780 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9b941201-f9ce-4abd-9dca-3c563169b19e]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.141 187010 DEBUG nova.compute.manager [req-f289fd06-c85f-42b7-bf25-bfc922a1860a req-4afff371-d2cd-4efb-94bc-6ad95d74a27b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.141 187010 DEBUG oslo_concurrency.lockutils [req-f289fd06-c85f-42b7-bf25-bfc922a1860a req-4afff371-d2cd-4efb-94bc-6ad95d74a27b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.142 187010 DEBUG oslo_concurrency.lockutils [req-f289fd06-c85f-42b7-bf25-bfc922a1860a req-4afff371-d2cd-4efb-94bc-6ad95d74a27b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.142 187010 DEBUG oslo_concurrency.lockutils [req-f289fd06-c85f-42b7-bf25-bfc922a1860a req-4afff371-d2cd-4efb-94bc-6ad95d74a27b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.142 187010 DEBUG nova.compute.manager [req-f289fd06-c85f-42b7-bf25-bfc922a1860a req-4afff371-d2cd-4efb-94bc-6ad95d74a27b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Processing event network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.169 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446723.1680183, 268e9416-7794-482e-8290-95afc792d4c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.169 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] VM Started (Lifecycle Event)#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.172 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.176 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.180 187010 INFO nova.virt.libvirt.driver [-] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Instance spawned successfully.#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.180 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.203 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.209 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.211 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.212 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.212 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.212 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.213 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.213 187010 DEBUG nova.virt.libvirt.driver [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.239 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.239 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446723.1683462, 268e9416-7794-482e-8290-95afc792d4c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.240 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.265 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.269 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446723.1755033, 268e9416-7794-482e-8290-95afc792d4c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.270 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.275 187010 INFO nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Took 9.43 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.276 187010 DEBUG nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.280 213337 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.280 213337 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.281 213337 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.285 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.288 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.313 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.347 187010 INFO nova.compute.manager [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Took 9.96 seconds to build instance.#033[00m
Dec 11 04:52:03 np0005555140 nova_compute[187006]: 2025-12-11 09:52:03.364 187010 DEBUG oslo_concurrency.lockutils [None req-ffb342ef-f74b-4562-849f-8bbd77542e32 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.908 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc38865-c501-4847-b713-79fb82e1f824]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.910 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0042166c-01 in ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.912 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0042166c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.912 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f61fa98a-5e32-4220-807c-317dc353d238]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.916 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f196b29c-81c7-4a7a-ac68-76e61fe0afeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.942 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[df19866a-9666-41db-8fe8-29717666db75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.964 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[282e4010-3ddb-49f5-a835-3b46aa0eaa36]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:03.967 104288 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpocst_lr8/privsep.sock']#033[00m
Dec 11 04:52:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:04.619 104288 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 04:52:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:04.621 104288 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpocst_lr8/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec 11 04:52:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:04.490 213358 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 04:52:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:04.497 213358 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 04:52:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:04.501 213358 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec 11 04:52:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:04.501 213358 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213358#033[00m
Dec 11 04:52:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:04.625 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[f6dcbfe4-d1d3-4e72-952a-ad8f1c17d341]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.072 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.116 213358 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.116 213358 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.116 213358 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.277 187010 DEBUG nova.compute.manager [req-c23d452e-e94a-4f4b-8dc0-97c3bd30c2f2 req-eaa0df3a-b493-492e-8e3f-541d33a43ac6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.278 187010 DEBUG oslo_concurrency.lockutils [req-c23d452e-e94a-4f4b-8dc0-97c3bd30c2f2 req-eaa0df3a-b493-492e-8e3f-541d33a43ac6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.278 187010 DEBUG oslo_concurrency.lockutils [req-c23d452e-e94a-4f4b-8dc0-97c3bd30c2f2 req-eaa0df3a-b493-492e-8e3f-541d33a43ac6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.278 187010 DEBUG oslo_concurrency.lockutils [req-c23d452e-e94a-4f4b-8dc0-97c3bd30c2f2 req-eaa0df3a-b493-492e-8e3f-541d33a43ac6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.279 187010 DEBUG nova.compute.manager [req-c23d452e-e94a-4f4b-8dc0-97c3bd30c2f2 req-eaa0df3a-b493-492e-8e3f-541d33a43ac6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] No waiting events found dispatching network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.279 187010 WARNING nova.compute.manager [req-c23d452e-e94a-4f4b-8dc0-97c3bd30c2f2 req-eaa0df3a-b493-492e-8e3f-541d33a43ac6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received unexpected event network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.714 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[1968f8d6-766b-40d9-9762-99c756437c4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.761 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[08b386f6-f34d-4d31-b3a4-eeda893265fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 NetworkManager[55531]: <info>  [1765446725.7660] manager: (tap0042166c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.789 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[37f68bc1-979f-4a0f-9933-b39a14b0da79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.792 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba6a391-325d-48ee-aa2c-9fa5ea67e703]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 systemd-udevd[213378]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:52:05 np0005555140 NetworkManager[55531]: <info>  [1765446725.8151] device (tap0042166c-00): carrier: link connected
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.820 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[913876b8-5566-46d2-8136-a490f34915cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.838 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e1421c-b968-4742-ab6c-b0923f48b627]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0042166c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:4a:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 308438, 'reachable_time': 42834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213397, 'error': None, 'target': 'ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.854 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[5ccd6ff0-d66c-4861-95f5-7d9f5ef6bf72]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:4a7a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 308438, 'tstamp': 308438}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213405, 'error': None, 'target': 'ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 podman[213367]: 2025-12-11 09:52:05.862046476 +0000 UTC m=+0.074027540 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.872 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf8363e-96c3-4f1a-a04a-e73a542e2a85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0042166c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:4a:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 308438, 'reachable_time': 42834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213413, 'error': None, 'target': 'ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.896 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[566b28ef-b824-4a39-8728-6aa7e30be3f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.942 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ac384e90-cff2-4c28-83b6-71df477af830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.944 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0042166c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.945 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.945 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0042166c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:05 np0005555140 kernel: tap0042166c-00: entered promiscuous mode
Dec 11 04:52:05 np0005555140 NetworkManager[55531]: <info>  [1765446725.9473] manager: (tap0042166c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.946 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.948 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.951 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0042166c-00, col_values=(('external_ids', {'iface-id': '4a5a3314-85c3-4857-9eae-abefe4fdeb92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:05 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:05Z|00031|binding|INFO|Releasing lport 4a5a3314-85c3-4857-9eae-abefe4fdeb92 from this chassis (sb_readonly=0)
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.952 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.952 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.953 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0042166c-0227-49c9-94ae-99b2fe7aec0a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0042166c-0227-49c9-94ae-99b2fe7aec0a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.954 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf8fdf6-3953-410b-b549-4b3e1411fb34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.955 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-0042166c-0227-49c9-94ae-99b2fe7aec0a
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/0042166c-0227-49c9-94ae-99b2fe7aec0a.pid.haproxy
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 0042166c-0227-49c9-94ae-99b2fe7aec0a
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:52:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:05.956 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'env', 'PROCESS_TAG=haproxy-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0042166c-0227-49c9-94ae-99b2fe7aec0a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:52:05 np0005555140 nova_compute[187006]: 2025-12-11 09:52:05.963 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:06 np0005555140 podman[213445]: 2025-12-11 09:52:06.278438503 +0000 UTC m=+0.026183831 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:52:06 np0005555140 nova_compute[187006]: 2025-12-11 09:52:06.648 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:06 np0005555140 podman[213445]: 2025-12-11 09:52:06.833755006 +0000 UTC m=+0.581500314 container create f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 04:52:06 np0005555140 systemd[1]: Started libpod-conmon-f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321.scope.
Dec 11 04:52:06 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:52:06 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f28559ad21578d269d99ea579ed12ac1f3b12c83fbf73a62dd4e4203c90c526/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:52:07 np0005555140 podman[213445]: 2025-12-11 09:52:07.169083793 +0000 UTC m=+0.916829101 container init f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 04:52:07 np0005555140 podman[213445]: 2025-12-11 09:52:07.176512135 +0000 UTC m=+0.924257443 container start f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 11 04:52:07 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [NOTICE]   (213464) : New worker (213466) forked
Dec 11 04:52:07 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [NOTICE]   (213464) : Loading success.
Dec 11 04:52:07 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:07Z|00032|binding|INFO|Releasing lport 4a5a3314-85c3-4857-9eae-abefe4fdeb92 from this chassis (sb_readonly=0)
Dec 11 04:52:07 np0005555140 nova_compute[187006]: 2025-12-11 09:52:07.819 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8210] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8214] device (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <warn>  [1765446727.8216] device (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8229] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8232] device (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <warn>  [1765446727.8233] device (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8245] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8252] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8257] device (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 11 04:52:07 np0005555140 NetworkManager[55531]: <info>  [1765446727.8260] device (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 11 04:52:07 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:07Z|00033|binding|INFO|Releasing lport 4a5a3314-85c3-4857-9eae-abefe4fdeb92 from this chassis (sb_readonly=0)
Dec 11 04:52:07 np0005555140 nova_compute[187006]: 2025-12-11 09:52:07.863 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:07 np0005555140 nova_compute[187006]: 2025-12-11 09:52:07.869 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:08 np0005555140 nova_compute[187006]: 2025-12-11 09:52:08.191 187010 DEBUG nova.compute.manager [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-changed-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:52:08 np0005555140 nova_compute[187006]: 2025-12-11 09:52:08.192 187010 DEBUG nova.compute.manager [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Refreshing instance network info cache due to event network-changed-98230026-4a3b-410e-9e9e-9a7b06879680. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:52:08 np0005555140 nova_compute[187006]: 2025-12-11 09:52:08.193 187010 DEBUG oslo_concurrency.lockutils [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:52:08 np0005555140 nova_compute[187006]: 2025-12-11 09:52:08.193 187010 DEBUG oslo_concurrency.lockutils [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:52:08 np0005555140 nova_compute[187006]: 2025-12-11 09:52:08.193 187010 DEBUG nova.network.neutron [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Refreshing network info cache for port 98230026-4a3b-410e-9e9e-9a7b06879680 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:52:09 np0005555140 podman[213477]: 2025-12-11 09:52:09.700906782 +0000 UTC m=+0.057747994 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=)
Dec 11 04:52:09 np0005555140 podman[213476]: 2025-12-11 09:52:09.717304961 +0000 UTC m=+0.086613040 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller)
Dec 11 04:52:09 np0005555140 nova_compute[187006]: 2025-12-11 09:52:09.787 187010 DEBUG nova.network.neutron [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updated VIF entry in instance network info cache for port 98230026-4a3b-410e-9e9e-9a7b06879680. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:52:09 np0005555140 nova_compute[187006]: 2025-12-11 09:52:09.788 187010 DEBUG nova.network.neutron [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updating instance_info_cache with network_info: [{"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:52:09 np0005555140 nova_compute[187006]: 2025-12-11 09:52:09.808 187010 DEBUG oslo_concurrency.lockutils [req-e1585eb6-de7e-4156-b354-0182b843df46 req-9b8fcee7-c1ab-4563-a9a0-c97a202ad70c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:52:10 np0005555140 nova_compute[187006]: 2025-12-11 09:52:10.074 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:11 np0005555140 nova_compute[187006]: 2025-12-11 09:52:11.650 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:15 np0005555140 nova_compute[187006]: 2025-12-11 09:52:15.078 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:15 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:15Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:c8:c2 10.100.0.7
Dec 11 04:52:15 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:15Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:c8:c2 10.100.0.7
Dec 11 04:52:16 np0005555140 nova_compute[187006]: 2025-12-11 09:52:16.652 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:19 np0005555140 podman[213547]: 2025-12-11 09:52:19.680537422 +0000 UTC m=+0.057932739 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:52:20 np0005555140 nova_compute[187006]: 2025-12-11 09:52:20.082 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:21 np0005555140 nova_compute[187006]: 2025-12-11 09:52:21.653 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:22 np0005555140 nova_compute[187006]: 2025-12-11 09:52:22.838 187010 INFO nova.compute.manager [None req-683dad54-213e-41a7-95b8-159aff0155e2 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Get console output#033[00m
Dec 11 04:52:22 np0005555140 nova_compute[187006]: 2025-12-11 09:52:22.953 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 04:52:25 np0005555140 nova_compute[187006]: 2025-12-11 09:52:25.085 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:26 np0005555140 nova_compute[187006]: 2025-12-11 09:52:26.655 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:27 np0005555140 podman[213572]: 2025-12-11 09:52:27.698524334 +0000 UTC m=+0.063754006 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 04:52:27 np0005555140 podman[213571]: 2025-12-11 09:52:27.709681763 +0000 UTC m=+0.073307029 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 11 04:52:30 np0005555140 nova_compute[187006]: 2025-12-11 09:52:30.088 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:31 np0005555140 nova_compute[187006]: 2025-12-11 09:52:31.556 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:31 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:31.555 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:52:31 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:31.557 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:52:31 np0005555140 nova_compute[187006]: 2025-12-11 09:52:31.658 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:32 np0005555140 podman[213610]: 2025-12-11 09:52:32.676865063 +0000 UTC m=+0.050494597 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 04:52:35 np0005555140 nova_compute[187006]: 2025-12-11 09:52:35.092 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:36 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:36.558 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:36 np0005555140 nova_compute[187006]: 2025-12-11 09:52:36.660 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:36 np0005555140 podman[213629]: 2025-12-11 09:52:36.711717928 +0000 UTC m=+0.090901053 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:52:40 np0005555140 nova_compute[187006]: 2025-12-11 09:52:40.096 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:40 np0005555140 podman[213654]: 2025-12-11 09:52:40.703993915 +0000 UTC m=+0.074545794 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 11 04:52:40 np0005555140 podman[213653]: 2025-12-11 09:52:40.752650248 +0000 UTC m=+0.117041011 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.260 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.260 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.276 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.359 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.360 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.369 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.369 187010 INFO nova.compute.claims [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.516 187010 DEBUG nova.compute.provider_tree [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.544 187010 ERROR nova.scheduler.client.report [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [req-a42b11a0-a3a3-44c4-8b9c-930be2b986c2] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID da0ef57a-f24e-4679-bba9-2f0d52d82a56.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-a42b11a0-a3a3-44c4-8b9c-930be2b986c2"}]}#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.562 187010 DEBUG nova.scheduler.client.report [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Refreshing inventories for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.591 187010 DEBUG nova.scheduler.client.report [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating ProviderTree inventory for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.592 187010 DEBUG nova.compute.provider_tree [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.606 187010 DEBUG nova.scheduler.client.report [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Refreshing aggregate associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.627 187010 DEBUG nova.scheduler.client.report [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Refreshing trait associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,HW_CPU_X86_SVM,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.665 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.684 187010 DEBUG nova.compute.provider_tree [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.726 187010 DEBUG nova.scheduler.client.report [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updated inventory for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.727 187010 DEBUG nova.compute.provider_tree [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.727 187010 DEBUG nova.compute.provider_tree [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.751 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.752 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.809 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.810 187010 DEBUG nova.network.neutron [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.835 187010 INFO nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.852 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.937 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.938 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.938 187010 INFO nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Creating image(s)
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.939 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.939 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.940 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:52:41 np0005555140 nova_compute[187006]: 2025-12-11 09:52:41.952 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.030 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.031 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.032 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.056 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.112 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.114 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.144 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.145 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.145 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.199 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.200 187010 DEBUG nova.virt.disk.api [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.200 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.268 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.269 187010 DEBUG nova.virt.disk.api [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.270 187010 DEBUG nova.objects.instance [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 651518ec-e299-4d5f-863a-12e51bae1a51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.287 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.287 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Ensure instance console log exists: /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.288 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.288 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.288 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:52:42 np0005555140 nova_compute[187006]: 2025-12-11 09:52:42.585 187010 DEBUG nova.policy [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 11 04:52:43 np0005555140 nova_compute[187006]: 2025-12-11 09:52:43.866 187010 DEBUG nova.network.neutron [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Successfully created port: e7d6b71c-5ecb-4782-9122-2ec367ad6583 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.872 187010 DEBUG nova.network.neutron [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Successfully updated port: e7d6b71c-5ecb-4782-9122-2ec367ad6583 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.887 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-651518ec-e299-4d5f-863a-12e51bae1a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.887 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-651518ec-e299-4d5f-863a-12e51bae1a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.887 187010 DEBUG nova.network.neutron [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.945 187010 DEBUG nova.compute.manager [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received event network-changed-e7d6b71c-5ecb-4782-9122-2ec367ad6583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.946 187010 DEBUG nova.compute.manager [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Refreshing instance network info cache due to event network-changed-e7d6b71c-5ecb-4782-9122-2ec367ad6583. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.946 187010 DEBUG oslo_concurrency.lockutils [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-651518ec-e299-4d5f-863a-12e51bae1a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 04:52:44 np0005555140 nova_compute[187006]: 2025-12-11 09:52:44.990 187010 DEBUG nova.network.neutron [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.100 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.541 187010 DEBUG nova.network.neutron [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Updating instance_info_cache with network_info: [{"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.592 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-651518ec-e299-4d5f-863a-12e51bae1a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.593 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Instance network_info: |[{"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.594 187010 DEBUG oslo_concurrency.lockutils [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-651518ec-e299-4d5f-863a-12e51bae1a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.594 187010 DEBUG nova.network.neutron [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Refreshing network info cache for port e7d6b71c-5ecb-4782-9122-2ec367ad6583 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.597 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Start _get_guest_xml network_info=[{"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.604 187010 WARNING nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.611 187010 DEBUG nova.virt.libvirt.host [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.612 187010 DEBUG nova.virt.libvirt.host [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.616 187010 DEBUG nova.virt.libvirt.host [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.617 187010 DEBUG nova.virt.libvirt.host [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.617 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.617 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.618 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.618 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.619 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.619 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.619 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.619 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.620 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.620 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.620 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.620 187010 DEBUG nova.virt.hardware [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.624 187010 DEBUG nova.virt.libvirt.vif [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-537395447',display_name='tempest-TestNetworkBasicOps-server-537395447',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-537395447',id=2,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPH3GSzAGsvIFDkxK4NkeQpDEQcWEmpAw0dNONFvQN3MxEO0P804B5LHwjZepSnkIcZtwuoSlTXXu0YxSEqYgbgi5OgHp0gmJpKGzHU+lCQYDo22eNtAR6fMYUTqe8eeKw==',key_name='tempest-TestNetworkBasicOps-166466146',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-8tixd14h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:52:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=651518ec-e299-4d5f-863a-12e51bae1a51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.625 187010 DEBUG nova.network.os_vif_util [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.626 187010 DEBUG nova.network.os_vif_util [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:e7:20,bridge_name='br-int',has_traffic_filtering=True,id=e7d6b71c-5ecb-4782-9122-2ec367ad6583,network=Network(157f9dbd-804b-4907-8bda-9737c94aaa14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7d6b71c-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.626 187010 DEBUG nova.objects.instance [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 651518ec-e299-4d5f-863a-12e51bae1a51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.639 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <uuid>651518ec-e299-4d5f-863a-12e51bae1a51</uuid>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <name>instance-00000002</name>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-537395447</nova:name>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:52:45</nova:creationTime>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        <nova:port uuid="e7d6b71c-5ecb-4782-9122-2ec367ad6583">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <entry name="serial">651518ec-e299-4d5f-863a-12e51bae1a51</entry>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <entry name="uuid">651518ec-e299-4d5f-863a-12e51bae1a51</entry>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk.config"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:27:e7:20"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <target dev="tape7d6b71c-5e"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/console.log" append="off"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:52:45 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:52:45 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:52:45 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:52:45 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.640 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Preparing to wait for external event network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.641 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.641 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.641 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.642 187010 DEBUG nova.virt.libvirt.vif [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-537395447',display_name='tempest-TestNetworkBasicOps-server-537395447',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-537395447',id=2,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPH3GSzAGsvIFDkxK4NkeQpDEQcWEmpAw0dNONFvQN3MxEO0P804B5LHwjZepSnkIcZtwuoSlTXXu0YxSEqYgbgi5OgHp0gmJpKGzHU+lCQYDo22eNtAR6fMYUTqe8eeKw==',key_name='tempest-TestNetworkBasicOps-166466146',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-8tixd14h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:52:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=651518ec-e299-4d5f-863a-12e51bae1a51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.642 187010 DEBUG nova.network.os_vif_util [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.643 187010 DEBUG nova.network.os_vif_util [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:e7:20,bridge_name='br-int',has_traffic_filtering=True,id=e7d6b71c-5ecb-4782-9122-2ec367ad6583,network=Network(157f9dbd-804b-4907-8bda-9737c94aaa14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7d6b71c-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.644 187010 DEBUG os_vif [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:e7:20,bridge_name='br-int',has_traffic_filtering=True,id=e7d6b71c-5ecb-4782-9122-2ec367ad6583,network=Network(157f9dbd-804b-4907-8bda-9737c94aaa14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7d6b71c-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.644 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.645 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.646 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.651 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.651 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7d6b71c-5e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.652 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7d6b71c-5e, col_values=(('external_ids', {'iface-id': 'e7d6b71c-5ecb-4782-9122-2ec367ad6583', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:e7:20', 'vm-uuid': '651518ec-e299-4d5f-863a-12e51bae1a51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.654 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:45 np0005555140 NetworkManager[55531]: <info>  [1765446765.6552] manager: (tape7d6b71c-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.656 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.661 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.662 187010 INFO os_vif [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:e7:20,bridge_name='br-int',has_traffic_filtering=True,id=e7d6b71c-5ecb-4782-9122-2ec367ad6583,network=Network(157f9dbd-804b-4907-8bda-9737c94aaa14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7d6b71c-5e')#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.715 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.715 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.716 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:27:e7:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.717 187010 INFO nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Using config drive#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.949 187010 INFO nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Creating config drive at /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk.config#033[00m
Dec 11 04:52:45 np0005555140 nova_compute[187006]: 2025-12-11 09:52:45.956 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpakifq4dy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.091 187010 DEBUG oslo_concurrency.processutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpakifq4dy" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:52:46 np0005555140 kernel: tape7d6b71c-5e: entered promiscuous mode
Dec 11 04:52:46 np0005555140 NetworkManager[55531]: <info>  [1765446766.1484] manager: (tape7d6b71c-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Dec 11 04:52:46 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:46Z|00034|binding|INFO|Claiming lport e7d6b71c-5ecb-4782-9122-2ec367ad6583 for this chassis.
Dec 11 04:52:46 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:46Z|00035|binding|INFO|e7d6b71c-5ecb-4782-9122-2ec367ad6583: Claiming fa:16:3e:27:e7:20 10.100.0.22
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.149 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.173 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:e7:20 10.100.0.22'], port_security=['fa:16:3e:27:e7:20 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '651518ec-e299-4d5f-863a-12e51bae1a51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-157f9dbd-804b-4907-8bda-9737c94aaa14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '409f37ed-73b8-4be3-909f-363f330e2f09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a58ef98-6de8-4458-bfe3-a0f3181d91df, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=e7d6b71c-5ecb-4782-9122-2ec367ad6583) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.175 104288 INFO neutron.agent.ovn.metadata.agent [-] Port e7d6b71c-5ecb-4782-9122-2ec367ad6583 in datapath 157f9dbd-804b-4907-8bda-9737c94aaa14 bound to our chassis#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.176 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 157f9dbd-804b-4907-8bda-9737c94aaa14#033[00m
Dec 11 04:52:46 np0005555140 systemd-udevd[213737]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.183 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 systemd-machined[153398]: New machine qemu-2-instance-00000002.
Dec 11 04:52:46 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:46Z|00036|binding|INFO|Setting lport e7d6b71c-5ecb-4782-9122-2ec367ad6583 ovn-installed in OVS
Dec 11 04:52:46 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:46Z|00037|binding|INFO|Setting lport e7d6b71c-5ecb-4782-9122-2ec367ad6583 up in Southbound
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.187 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[057d6dbd-9952-4928-a642-e015251da978]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.188 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap157f9dbd-81 in ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.190 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap157f9dbd-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.190 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[fef8278b-c19f-4315-8d95-9a4ca73b6884]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.190 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.191 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4579cbce-0cfb-4b15-9e58-fef6dae27f24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 NetworkManager[55531]: <info>  [1765446766.1940] device (tape7d6b71c-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:52:46 np0005555140 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 11 04:52:46 np0005555140 NetworkManager[55531]: <info>  [1765446766.1952] device (tape7d6b71c-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.213 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[690e8321-b469-4dd2-a4bc-9ca7e53ce906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.226 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[db43a6f0-7915-42d4-b586-9db063811eed]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.249 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[675b71a3-38f9-4321-a709-3742ada94ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 systemd-udevd[213741]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.254 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9e67844c-e3f5-46d2-a3e6-ef648eef4b90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 NetworkManager[55531]: <info>  [1765446766.2551] manager: (tap157f9dbd-80): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.279 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[104055b2-7526-4318-bac3-a1fd99f4559a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.281 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[d4e37e9d-6598-4b91-afd5-3c6b045a232f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 NetworkManager[55531]: <info>  [1765446766.2997] device (tap157f9dbd-80): carrier: link connected
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.304 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[06e9a9df-6189-4f63-8f8c-5d6b4f229dc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.320 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[36715a28-32e7-4d8e-9282-3784bfd12883]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap157f9dbd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:f9:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312486, 'reachable_time': 39991, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213771, 'error': None, 'target': 'ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.332 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9afc90-5fde-4ce0-9eb1-45766c6c6654]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:f977'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 312486, 'tstamp': 312486}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213772, 'error': None, 'target': 'ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.346 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f96ec7-de23-47e8-aa3b-9ee184c337b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap157f9dbd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:f9:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312486, 'reachable_time': 39991, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213773, 'error': None, 'target': 'ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.371 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9515a533-7f6c-45dd-90ad-8ccd6fd06cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.415 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff2754b-1518-44a2-a0e5-b8f473b0215d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.416 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap157f9dbd-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.416 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.417 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap157f9dbd-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:46 np0005555140 kernel: tap157f9dbd-80: entered promiscuous mode
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.418 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 NetworkManager[55531]: <info>  [1765446766.4201] manager: (tap157f9dbd-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.420 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.422 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap157f9dbd-80, col_values=(('external_ids', {'iface-id': '051522b2-3136-4382-9233-7146be317ccb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.423 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:46Z|00038|binding|INFO|Releasing lport 051522b2-3136-4382-9233-7146be317ccb from this chassis (sb_readonly=0)
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.423 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.425 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/157f9dbd-804b-4907-8bda-9737c94aaa14.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/157f9dbd-804b-4907-8bda-9737c94aaa14.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.426 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[2017d9ca-e09a-41d5-8d93-5e930576013f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.426 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-157f9dbd-804b-4907-8bda-9737c94aaa14
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/157f9dbd-804b-4907-8bda-9737c94aaa14.pid.haproxy
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 157f9dbd-804b-4907-8bda-9737c94aaa14
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:52:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:46.427 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14', 'env', 'PROCESS_TAG=haproxy-157f9dbd-804b-4907-8bda-9737c94aaa14', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/157f9dbd-804b-4907-8bda-9737c94aaa14.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.434 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.578 187010 DEBUG nova.network.neutron [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Updated VIF entry in instance network info cache for port e7d6b71c-5ecb-4782-9122-2ec367ad6583. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.579 187010 DEBUG nova.network.neutron [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Updating instance_info_cache with network_info: [{"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.594 187010 DEBUG oslo_concurrency.lockutils [req-71f6a884-1e10-4575-a19e-ab8b28da384a req-23c2b6fb-a8b9-49e7-88c3-e7887f2783f0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-651518ec-e299-4d5f-863a-12e51bae1a51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.663 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.738 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446766.7374094, 651518ec-e299-4d5f-863a-12e51bae1a51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.739 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] VM Started (Lifecycle Event)#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.758 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.763 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446766.7376373, 651518ec-e299-4d5f-863a-12e51bae1a51 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.763 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.786 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.791 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:52:46 np0005555140 nova_compute[187006]: 2025-12-11 09:52:46.812 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:52:46 np0005555140 podman[213813]: 2025-12-11 09:52:46.848496947 +0000 UTC m=+0.077058306 container create 27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:52:46 np0005555140 systemd[1]: Started libpod-conmon-27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce.scope.
Dec 11 04:52:46 np0005555140 podman[213813]: 2025-12-11 09:52:46.812180058 +0000 UTC m=+0.040741467 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:52:46 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:52:46 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f515bebc18471bde31744bb66a26303b7c42f1d3c6ab49264c6d204d4c03287e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:52:46 np0005555140 podman[213813]: 2025-12-11 09:52:46.952925986 +0000 UTC m=+0.181487335 container init 27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 04:52:46 np0005555140 podman[213813]: 2025-12-11 09:52:46.957725053 +0000 UTC m=+0.186286392 container start 27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Dec 11 04:52:46 np0005555140 neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14[213828]: [NOTICE]   (213832) : New worker (213834) forked
Dec 11 04:52:46 np0005555140 neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14[213828]: [NOTICE]   (213832) : Loading success.
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.016 187010 DEBUG nova.compute.manager [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received event network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.016 187010 DEBUG oslo_concurrency.lockutils [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.016 187010 DEBUG oslo_concurrency.lockutils [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.017 187010 DEBUG oslo_concurrency.lockutils [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.017 187010 DEBUG nova.compute.manager [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Processing event network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.017 187010 DEBUG nova.compute.manager [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received event network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.017 187010 DEBUG oslo_concurrency.lockutils [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.018 187010 DEBUG oslo_concurrency.lockutils [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.018 187010 DEBUG oslo_concurrency.lockutils [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.018 187010 DEBUG nova.compute.manager [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] No waiting events found dispatching network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.018 187010 WARNING nova.compute.manager [req-8c826fd2-f3a0-4512-9f66-0ef46bed1207 req-e1e2c4e9-2b4f-41b9-a7e8-3d6e535d615d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received unexpected event network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 for instance with vm_state building and task_state spawning.#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.019 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.023 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446767.0229213, 651518ec-e299-4d5f-863a-12e51bae1a51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.023 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.025 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.028 187010 INFO nova.virt.libvirt.driver [-] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Instance spawned successfully.#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.029 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.040 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.045 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.076 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.087 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.088 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.088 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.088 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.089 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.089 187010 DEBUG nova.virt.libvirt.driver [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.134 187010 INFO nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Took 5.20 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.134 187010 DEBUG nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.182 187010 INFO nova.compute.manager [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Took 5.86 seconds to build instance.#033[00m
Dec 11 04:52:47 np0005555140 nova_compute[187006]: 2025-12-11 09:52:47.198 187010 DEBUG oslo_concurrency.lockutils [None req-541295a4-e883-4b9d-9716-ea64d5e3ac10 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.938s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:48.616 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:48.617 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:52:48.617 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:48 np0005555140 nova_compute[187006]: 2025-12-11 09:52:48.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:50 np0005555140 nova_compute[187006]: 2025-12-11 09:52:50.657 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:50 np0005555140 podman[213843]: 2025-12-11 09:52:50.686781316 +0000 UTC m=+0.053654747 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:52:50 np0005555140 nova_compute[187006]: 2025-12-11 09:52:50.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:51 np0005555140 nova_compute[187006]: 2025-12-11 09:52:51.667 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:52 np0005555140 nova_compute[187006]: 2025-12-11 09:52:52.589 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:52 np0005555140 nova_compute[187006]: 2025-12-11 09:52:52.589 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:52:52 np0005555140 nova_compute[187006]: 2025-12-11 09:52:52.590 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:52:53 np0005555140 nova_compute[187006]: 2025-12-11 09:52:53.589 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:52:53 np0005555140 nova_compute[187006]: 2025-12-11 09:52:53.589 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquired lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:52:53 np0005555140 nova_compute[187006]: 2025-12-11 09:52:53.590 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 04:52:53 np0005555140 nova_compute[187006]: 2025-12-11 09:52:53.590 187010 DEBUG nova.objects.instance [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 268e9416-7794-482e-8290-95afc792d4c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:52:55 np0005555140 nova_compute[187006]: 2025-12-11 09:52:55.662 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.379 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updating instance_info_cache with network_info: [{"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.396 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Releasing lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.397 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.398 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.399 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.399 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.400 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.400 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.401 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.401 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.422 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.423 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.423 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.424 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.500 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.588 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.589 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.655 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.663 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.678 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.714 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.716 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.773 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.964 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.966 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5449MB free_disk=73.30279922485352GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.966 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:52:56 np0005555140 nova_compute[187006]: 2025-12-11 09:52:56.966 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.060 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 268e9416-7794-482e-8290-95afc792d4c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.061 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 651518ec-e299-4d5f-863a-12e51bae1a51 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.061 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.061 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.146 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.167 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.192 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:52:57 np0005555140 nova_compute[187006]: 2025-12-11 09:52:57.193 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:52:57 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:57Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:e7:20 10.100.0.22
Dec 11 04:52:57 np0005555140 ovn_controller[95438]: 2025-12-11T09:52:57Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:e7:20 10.100.0.22
Dec 11 04:52:58 np0005555140 nova_compute[187006]: 2025-12-11 09:52:58.194 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:52:58 np0005555140 podman[213897]: 2025-12-11 09:52:58.687040841 +0000 UTC m=+0.058976186 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 04:52:58 np0005555140 podman[213896]: 2025-12-11 09:52:58.695158451 +0000 UTC m=+0.067325783 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:53:00 np0005555140 nova_compute[187006]: 2025-12-11 09:53:00.666 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:01 np0005555140 nova_compute[187006]: 2025-12-11 09:53:01.670 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:03 np0005555140 podman[213934]: 2025-12-11 09:53:03.692627673 +0000 UTC m=+0.050872525 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.675 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.935 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.935 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.936 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.936 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.936 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.938 187010 INFO nova.compute.manager [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Terminating instance#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.938 187010 DEBUG nova.compute.manager [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:53:05 np0005555140 kernel: tape7d6b71c-5e (unregistering): left promiscuous mode
Dec 11 04:53:05 np0005555140 NetworkManager[55531]: <info>  [1765446785.9654] device (tape7d6b71c-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:53:05 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:05Z|00039|binding|INFO|Releasing lport e7d6b71c-5ecb-4782-9122-2ec367ad6583 from this chassis (sb_readonly=0)
Dec 11 04:53:05 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:05Z|00040|binding|INFO|Setting lport e7d6b71c-5ecb-4782-9122-2ec367ad6583 down in Southbound
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.976 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:05 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:05Z|00041|binding|INFO|Removing iface tape7d6b71c-5e ovn-installed in OVS
Dec 11 04:53:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:05.986 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:e7:20 10.100.0.22'], port_security=['fa:16:3e:27:e7:20 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '651518ec-e299-4d5f-863a-12e51bae1a51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-157f9dbd-804b-4907-8bda-9737c94aaa14', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '409f37ed-73b8-4be3-909f-363f330e2f09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a58ef98-6de8-4458-bfe3-a0f3181d91df, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=e7d6b71c-5ecb-4782-9122-2ec367ad6583) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:53:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:05.988 104288 INFO neutron.agent.ovn.metadata.agent [-] Port e7d6b71c-5ecb-4782-9122-2ec367ad6583 in datapath 157f9dbd-804b-4907-8bda-9737c94aaa14 unbound from our chassis#033[00m
Dec 11 04:53:05 np0005555140 nova_compute[187006]: 2025-12-11 09:53:05.989 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:05.989 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 157f9dbd-804b-4907-8bda-9737c94aaa14, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:53:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:05.990 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[2a528e5e-d761-41c9-937f-431d51b01598]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:05.991 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14 namespace which is not needed anymore#033[00m
Dec 11 04:53:06 np0005555140 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 11 04:53:06 np0005555140 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 12.061s CPU time.
Dec 11 04:53:06 np0005555140 systemd-machined[153398]: Machine qemu-2-instance-00000002 terminated.
Dec 11 04:53:06 np0005555140 neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14[213828]: [NOTICE]   (213832) : haproxy version is 2.8.14-c23fe91
Dec 11 04:53:06 np0005555140 neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14[213828]: [NOTICE]   (213832) : path to executable is /usr/sbin/haproxy
Dec 11 04:53:06 np0005555140 neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14[213828]: [WARNING]  (213832) : Exiting Master process...
Dec 11 04:53:06 np0005555140 neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14[213828]: [ALERT]    (213832) : Current worker (213834) exited with code 143 (Terminated)
Dec 11 04:53:06 np0005555140 neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14[213828]: [WARNING]  (213832) : All workers exited. Exiting... (0)
Dec 11 04:53:06 np0005555140 systemd[1]: libpod-27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce.scope: Deactivated successfully.
Dec 11 04:53:06 np0005555140 podman[213977]: 2025-12-11 09:53:06.120599449 +0000 UTC m=+0.050968118 container died 27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:53:06 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce-userdata-shm.mount: Deactivated successfully.
Dec 11 04:53:06 np0005555140 systemd[1]: var-lib-containers-storage-overlay-f515bebc18471bde31744bb66a26303b7c42f1d3c6ab49264c6d204d4c03287e-merged.mount: Deactivated successfully.
Dec 11 04:53:06 np0005555140 podman[213977]: 2025-12-11 09:53:06.150318463 +0000 UTC m=+0.080687142 container cleanup 27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.163 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:06 np0005555140 systemd[1]: libpod-conmon-27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce.scope: Deactivated successfully.
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.168 187010 DEBUG nova.compute.manager [req-abb42646-1a22-416b-9888-d8e124447bb0 req-0d15e6fa-dc8e-47fd-b6f3-3eee0ae9289b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received event network-vif-unplugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.169 187010 DEBUG oslo_concurrency.lockutils [req-abb42646-1a22-416b-9888-d8e124447bb0 req-0d15e6fa-dc8e-47fd-b6f3-3eee0ae9289b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.169 187010 DEBUG oslo_concurrency.lockutils [req-abb42646-1a22-416b-9888-d8e124447bb0 req-0d15e6fa-dc8e-47fd-b6f3-3eee0ae9289b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.169 187010 DEBUG oslo_concurrency.lockutils [req-abb42646-1a22-416b-9888-d8e124447bb0 req-0d15e6fa-dc8e-47fd-b6f3-3eee0ae9289b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.170 187010 DEBUG nova.compute.manager [req-abb42646-1a22-416b-9888-d8e124447bb0 req-0d15e6fa-dc8e-47fd-b6f3-3eee0ae9289b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] No waiting events found dispatching network-vif-unplugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.170 187010 DEBUG nova.compute.manager [req-abb42646-1a22-416b-9888-d8e124447bb0 req-0d15e6fa-dc8e-47fd-b6f3-3eee0ae9289b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received event network-vif-unplugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.171 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.199 187010 INFO nova.virt.libvirt.driver [-] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Instance destroyed successfully.#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.202 187010 DEBUG nova.objects.instance [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 651518ec-e299-4d5f-863a-12e51bae1a51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.219 187010 DEBUG nova.virt.libvirt.vif [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-537395447',display_name='tempest-TestNetworkBasicOps-server-537395447',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-537395447',id=2,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPH3GSzAGsvIFDkxK4NkeQpDEQcWEmpAw0dNONFvQN3MxEO0P804B5LHwjZepSnkIcZtwuoSlTXXu0YxSEqYgbgi5OgHp0gmJpKGzHU+lCQYDo22eNtAR6fMYUTqe8eeKw==',key_name='tempest-TestNetworkBasicOps-166466146',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:52:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-8tixd14h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:52:47Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=651518ec-e299-4d5f-863a-12e51bae1a51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.220 187010 DEBUG nova.network.os_vif_util [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "address": "fa:16:3e:27:e7:20", "network": {"id": "157f9dbd-804b-4907-8bda-9737c94aaa14", "bridge": "br-int", "label": "tempest-network-smoke--1546880206", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7d6b71c-5e", "ovs_interfaceid": "e7d6b71c-5ecb-4782-9122-2ec367ad6583", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.220 187010 DEBUG nova.network.os_vif_util [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:e7:20,bridge_name='br-int',has_traffic_filtering=True,id=e7d6b71c-5ecb-4782-9122-2ec367ad6583,network=Network(157f9dbd-804b-4907-8bda-9737c94aaa14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7d6b71c-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.221 187010 DEBUG os_vif [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:e7:20,bridge_name='br-int',has_traffic_filtering=True,id=e7d6b71c-5ecb-4782-9122-2ec367ad6583,network=Network(157f9dbd-804b-4907-8bda-9737c94aaa14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7d6b71c-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.223 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.223 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7d6b71c-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.224 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:06 np0005555140 podman[214008]: 2025-12-11 09:53:06.226248279 +0000 UTC m=+0.048096016 container remove 27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.227 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.230 187010 INFO os_vif [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:e7:20,bridge_name='br-int',has_traffic_filtering=True,id=e7d6b71c-5ecb-4782-9122-2ec367ad6583,network=Network(157f9dbd-804b-4907-8bda-9737c94aaa14),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7d6b71c-5e')#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.230 187010 INFO nova.virt.libvirt.driver [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Deleting instance files /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51_del#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.231 187010 INFO nova.virt.libvirt.driver [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Deletion of /var/lib/nova/instances/651518ec-e299-4d5f-863a-12e51bae1a51_del complete#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.231 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[cd88d766-4e13-4f4e-9a09-40fc3733f130]: (4, ('Thu Dec 11 09:53:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14 (27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce)\n27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce\nThu Dec 11 09:53:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14 (27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce)\n27e8082fda0e72a68f32852058ecb45ac7e6262549da6ce919aa61b29acab5ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.232 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[caffa7d0-6be0-4df9-80be-d3ea35d542c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.233 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap157f9dbd-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.235 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:06 np0005555140 kernel: tap157f9dbd-80: left promiscuous mode
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.247 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.249 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[04e73c86-798c-4d6e-b3d9-9af31cee5c99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.263 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[277a516e-618b-4c21-8fb9-da5970f2974b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.264 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[3887ffc2-0bc1-4a7d-b7a0-1d033d8e733f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.279 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e0cde254-00a2-467b-84ce-ef4cf1ac8ea9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312481, 'reachable_time': 31638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214033, 'error': None, 'target': 'ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.290 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-157f9dbd-804b-4907-8bda-9737c94aaa14 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:53:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:06.291 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[05248d75-ef4b-4eb3-81fc-f86d358319ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:06 np0005555140 systemd[1]: run-netns-ovnmeta\x2d157f9dbd\x2d804b\x2d4907\x2d8bda\x2d9737c94aaa14.mount: Deactivated successfully.
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.295 187010 DEBUG nova.virt.libvirt.host [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.295 187010 INFO nova.virt.libvirt.host [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] UEFI support detected#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.297 187010 INFO nova.compute.manager [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.297 187010 DEBUG oslo.service.loopingcall [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.297 187010 DEBUG nova.compute.manager [-] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.298 187010 DEBUG nova.network.neutron [-] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.675 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.954 187010 DEBUG nova.network.neutron [-] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:53:06 np0005555140 nova_compute[187006]: 2025-12-11 09:53:06.972 187010 INFO nova.compute.manager [-] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Took 0.67 seconds to deallocate network for instance.#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.021 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.021 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.042 187010 DEBUG nova.compute.manager [req-637d5e59-9374-475a-91dc-b7f96be656bb req-d294b5f3-6692-471d-8c31-0f0c61478ee4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received event network-vif-deleted-e7d6b71c-5ecb-4782-9122-2ec367ad6583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.092 187010 DEBUG nova.compute.provider_tree [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.106 187010 DEBUG nova.scheduler.client.report [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.131 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.163 187010 INFO nova.scheduler.client.report [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 651518ec-e299-4d5f-863a-12e51bae1a51#033[00m
Dec 11 04:53:07 np0005555140 nova_compute[187006]: 2025-12-11 09:53:07.261 187010 DEBUG oslo_concurrency.lockutils [None req-6033da93-c950-411d-9382-ce339cf21f49 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:07 np0005555140 podman[214035]: 2025-12-11 09:53:07.690751926 +0000 UTC m=+0.057932196 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 04:53:08 np0005555140 nova_compute[187006]: 2025-12-11 09:53:08.286 187010 DEBUG nova.compute.manager [req-1561e875-a51e-4575-b323-9566392167b1 req-304a605d-a2bd-4fa5-9e3a-d73e1f34cef1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received event network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:08 np0005555140 nova_compute[187006]: 2025-12-11 09:53:08.287 187010 DEBUG oslo_concurrency.lockutils [req-1561e875-a51e-4575-b323-9566392167b1 req-304a605d-a2bd-4fa5-9e3a-d73e1f34cef1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:08 np0005555140 nova_compute[187006]: 2025-12-11 09:53:08.287 187010 DEBUG oslo_concurrency.lockutils [req-1561e875-a51e-4575-b323-9566392167b1 req-304a605d-a2bd-4fa5-9e3a-d73e1f34cef1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:08 np0005555140 nova_compute[187006]: 2025-12-11 09:53:08.288 187010 DEBUG oslo_concurrency.lockutils [req-1561e875-a51e-4575-b323-9566392167b1 req-304a605d-a2bd-4fa5-9e3a-d73e1f34cef1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "651518ec-e299-4d5f-863a-12e51bae1a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:08 np0005555140 nova_compute[187006]: 2025-12-11 09:53:08.288 187010 DEBUG nova.compute.manager [req-1561e875-a51e-4575-b323-9566392167b1 req-304a605d-a2bd-4fa5-9e3a-d73e1f34cef1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] No waiting events found dispatching network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:53:08 np0005555140 nova_compute[187006]: 2025-12-11 09:53:08.288 187010 WARNING nova.compute.manager [req-1561e875-a51e-4575-b323-9566392167b1 req-304a605d-a2bd-4fa5-9e3a-d73e1f34cef1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Received unexpected event network-vif-plugged-e7d6b71c-5ecb-4782-9122-2ec367ad6583 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:53:09 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:09Z|00042|binding|INFO|Releasing lport 4a5a3314-85c3-4857-9eae-abefe4fdeb92 from this chassis (sb_readonly=0)
Dec 11 04:53:09 np0005555140 nova_compute[187006]: 2025-12-11 09:53:09.075 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.219 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.221 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.221 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.222 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.222 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.224 187010 INFO nova.compute.manager [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Terminating instance#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.226 187010 DEBUG nova.compute.manager [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:53:10 np0005555140 kernel: tap98230026-4a (unregistering): left promiscuous mode
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.256 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 NetworkManager[55531]: <info>  [1765446790.2569] device (tap98230026-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:53:10 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:10Z|00043|binding|INFO|Releasing lport 98230026-4a3b-410e-9e9e-9a7b06879680 from this chassis (sb_readonly=0)
Dec 11 04:53:10 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:10Z|00044|binding|INFO|Setting lport 98230026-4a3b-410e-9e9e-9a7b06879680 down in Southbound
Dec 11 04:53:10 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:10Z|00045|binding|INFO|Removing iface tap98230026-4a ovn-installed in OVS
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.260 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.265 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:c8:c2 10.100.0.7'], port_security=['fa:16:3e:a8:c8:c2 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '268e9416-7794-482e-8290-95afc792d4c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd55fe206-2b00-4f4d-9918-4d5d81a7b847', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c43167a1-3764-4b58-b022-3678644ed4a5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=98230026-4a3b-410e-9e9e-9a7b06879680) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.266 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 98230026-4a3b-410e-9e9e-9a7b06879680 in datapath 0042166c-0227-49c9-94ae-99b2fe7aec0a unbound from our chassis#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.267 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0042166c-0227-49c9-94ae-99b2fe7aec0a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.267 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[db5bd642-ed75-4155-9384-6fb0cce2ec19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.268 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a namespace which is not needed anymore#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.290 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 11 04:53:10 np0005555140 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.078s CPU time.
Dec 11 04:53:10 np0005555140 systemd-machined[153398]: Machine qemu-1-instance-00000001 terminated.
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.354 187010 DEBUG nova.compute.manager [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-changed-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.356 187010 DEBUG nova.compute.manager [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Refreshing instance network info cache due to event network-changed-98230026-4a3b-410e-9e9e-9a7b06879680. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.357 187010 DEBUG oslo_concurrency.lockutils [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.357 187010 DEBUG oslo_concurrency.lockutils [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.357 187010 DEBUG nova.network.neutron [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Refreshing network info cache for port 98230026-4a3b-410e-9e9e-9a7b06879680 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:53:10 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [NOTICE]   (213464) : haproxy version is 2.8.14-c23fe91
Dec 11 04:53:10 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [NOTICE]   (213464) : path to executable is /usr/sbin/haproxy
Dec 11 04:53:10 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [WARNING]  (213464) : Exiting Master process...
Dec 11 04:53:10 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [WARNING]  (213464) : Exiting Master process...
Dec 11 04:53:10 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [ALERT]    (213464) : Current worker (213466) exited with code 143 (Terminated)
Dec 11 04:53:10 np0005555140 neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a[213460]: [WARNING]  (213464) : All workers exited. Exiting... (0)
Dec 11 04:53:10 np0005555140 systemd[1]: libpod-f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321.scope: Deactivated successfully.
Dec 11 04:53:10 np0005555140 podman[214083]: 2025-12-11 09:53:10.409150558 +0000 UTC m=+0.046644015 container died f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 04:53:10 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321-userdata-shm.mount: Deactivated successfully.
Dec 11 04:53:10 np0005555140 systemd[1]: var-lib-containers-storage-overlay-4f28559ad21578d269d99ea579ed12ac1f3b12c83fbf73a62dd4e4203c90c526-merged.mount: Deactivated successfully.
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.448 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.453 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 podman[214083]: 2025-12-11 09:53:10.456963106 +0000 UTC m=+0.094456563 container cleanup f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 04:53:10 np0005555140 systemd[1]: libpod-conmon-f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321.scope: Deactivated successfully.
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.491 187010 INFO nova.virt.libvirt.driver [-] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Instance destroyed successfully.#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.492 187010 DEBUG nova.objects.instance [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 268e9416-7794-482e-8290-95afc792d4c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.509 187010 DEBUG nova.virt.libvirt.vif [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1217004400',display_name='tempest-TestNetworkBasicOps-server-1217004400',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1217004400',id=1,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGJ6bz+VXXSH81a0VHz7rgBFE29m5WU0Y/vAsmHz3b31tXEehN7HVhY89KfeMsMIWFiwKd0Dv1bEIWabt+cel+h24NvALwfU+elUZgsFh6eWOkAB0Ht7CHhvmXSV8HY79A==',key_name='tempest-TestNetworkBasicOps-1693520131',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:52:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-rhc7zd77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:52:03Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=268e9416-7794-482e-8290-95afc792d4c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.510 187010 DEBUG nova.network.os_vif_util [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.511 187010 DEBUG nova.network.os_vif_util [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a8:c8:c2,bridge_name='br-int',has_traffic_filtering=True,id=98230026-4a3b-410e-9e9e-9a7b06879680,network=Network(0042166c-0227-49c9-94ae-99b2fe7aec0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98230026-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.511 187010 DEBUG os_vif [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:c8:c2,bridge_name='br-int',has_traffic_filtering=True,id=98230026-4a3b-410e-9e9e-9a7b06879680,network=Network(0042166c-0227-49c9-94ae-99b2fe7aec0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98230026-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.513 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.513 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98230026-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.514 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.516 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.519 187010 INFO os_vif [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:c8:c2,bridge_name='br-int',has_traffic_filtering=True,id=98230026-4a3b-410e-9e9e-9a7b06879680,network=Network(0042166c-0227-49c9-94ae-99b2fe7aec0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98230026-4a')#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.520 187010 INFO nova.virt.libvirt.driver [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Deleting instance files /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1_del#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.521 187010 INFO nova.virt.libvirt.driver [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Deletion of /var/lib/nova/instances/268e9416-7794-482e-8290-95afc792d4c1_del complete#033[00m
Dec 11 04:53:10 np0005555140 podman[214123]: 2025-12-11 09:53:10.528324532 +0000 UTC m=+0.048121517 container remove f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.534 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[46ea39ad-4118-4beb-97b7-10aef4d890b2]: (4, ('Thu Dec 11 09:53:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a (f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321)\nf8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321\nThu Dec 11 09:53:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a (f8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321)\nf8158ac9192ad219531513754defaea111e562acd7ef27b1442210df5a06e321\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.535 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[997a7e0e-735f-49f6-8701-5ca49118a904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.536 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0042166c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.537 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 kernel: tap0042166c-00: left promiscuous mode
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.548 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.551 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[48606d09-9fbf-4992-8ccd-103d067600fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.562 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4a74b72d-8a1f-4625-aa42-2e7133e6197b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.563 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9d25a586-c190-464d-92dc-107566235a7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.580 187010 INFO nova.compute.manager [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.580 187010 DEBUG oslo.service.loopingcall [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.581 187010 DEBUG nova.compute.manager [-] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:53:10 np0005555140 nova_compute[187006]: 2025-12-11 09:53:10.581 187010 DEBUG nova.network.neutron [-] [instance: 268e9416-7794-482e-8290-95afc792d4c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.580 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[d0eacef8-29a2-496b-b5da-7bedaa1e8aab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 308427, 'reachable_time': 18742, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214144, 'error': None, 'target': 'ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.583 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0042166c-0227-49c9-94ae-99b2fe7aec0a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:53:10 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:10.583 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[70b2d8b6-5880-4404-912d-96ac357c7052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:10 np0005555140 systemd[1]: run-netns-ovnmeta\x2d0042166c\x2d0227\x2d49c9\x2d94ae\x2d99b2fe7aec0a.mount: Deactivated successfully.
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.559 187010 DEBUG nova.network.neutron [-] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.579 187010 INFO nova.compute.manager [-] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Took 1.00 seconds to deallocate network for instance.#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.620 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.621 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.673 187010 DEBUG nova.compute.provider_tree [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.676 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:11 np0005555140 podman[214146]: 2025-12-11 09:53:11.67907951 +0000 UTC m=+0.054263002 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, 
release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6)
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.692 187010 DEBUG nova.scheduler.client.report [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:53:11 np0005555140 podman[214145]: 2025-12-11 09:53:11.708217197 +0000 UTC m=+0.083724668 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.713 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.741 187010 INFO nova.scheduler.client.report [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 268e9416-7794-482e-8290-95afc792d4c1#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.802 187010 DEBUG oslo_concurrency.lockutils [None req-2fcabc14-4e67-4c87-b552-7fb94610310c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.963 187010 DEBUG nova.network.neutron [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updated VIF entry in instance network info cache for port 98230026-4a3b-410e-9e9e-9a7b06879680. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.964 187010 DEBUG nova.network.neutron [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Updating instance_info_cache with network_info: [{"id": "98230026-4a3b-410e-9e9e-9a7b06879680", "address": "fa:16:3e:a8:c8:c2", "network": {"id": "0042166c-0227-49c9-94ae-99b2fe7aec0a", "bridge": "br-int", "label": "tempest-network-smoke--730270468", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap98230026-4a", "ovs_interfaceid": "98230026-4a3b-410e-9e9e-9a7b06879680", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:53:11 np0005555140 nova_compute[187006]: 2025-12-11 09:53:11.981 187010 DEBUG oslo_concurrency.lockutils [req-72825dee-f452-4d69-814f-603d16eecaf6 req-a64e980e-cc63-4c0e-b2a1-c1454843d5de b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-268e9416-7794-482e-8290-95afc792d4c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.427 187010 DEBUG nova.compute.manager [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-vif-unplugged-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.427 187010 DEBUG oslo_concurrency.lockutils [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.427 187010 DEBUG oslo_concurrency.lockutils [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.427 187010 DEBUG oslo_concurrency.lockutils [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.428 187010 DEBUG nova.compute.manager [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] No waiting events found dispatching network-vif-unplugged-98230026-4a3b-410e-9e9e-9a7b06879680 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.428 187010 WARNING nova.compute.manager [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received unexpected event network-vif-unplugged-98230026-4a3b-410e-9e9e-9a7b06879680 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.428 187010 DEBUG nova.compute.manager [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.428 187010 DEBUG oslo_concurrency.lockutils [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "268e9416-7794-482e-8290-95afc792d4c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.428 187010 DEBUG oslo_concurrency.lockutils [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.428 187010 DEBUG oslo_concurrency.lockutils [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "268e9416-7794-482e-8290-95afc792d4c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.429 187010 DEBUG nova.compute.manager [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] No waiting events found dispatching network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.429 187010 WARNING nova.compute.manager [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received unexpected event network-vif-plugged-98230026-4a3b-410e-9e9e-9a7b06879680 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:53:12 np0005555140 nova_compute[187006]: 2025-12-11 09:53:12.429 187010 DEBUG nova.compute.manager [req-3b8c64f3-075b-4c28-a8a5-76a4fd8267b2 req-201f632b-0966-4c6f-8a35-a9b2b838ad4a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Received event network-vif-deleted-98230026-4a3b-410e-9e9e-9a7b06879680 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:14 np0005555140 nova_compute[187006]: 2025-12-11 09:53:14.887 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:15 np0005555140 nova_compute[187006]: 2025-12-11 09:53:15.001 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:15 np0005555140 nova_compute[187006]: 2025-12-11 09:53:15.516 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:16 np0005555140 nova_compute[187006]: 2025-12-11 09:53:16.679 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:20 np0005555140 nova_compute[187006]: 2025-12-11 09:53:20.520 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:21 np0005555140 nova_compute[187006]: 2025-12-11 09:53:21.198 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765446786.1965585, 651518ec-e299-4d5f-863a-12e51bae1a51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:53:21 np0005555140 nova_compute[187006]: 2025-12-11 09:53:21.199 187010 INFO nova.compute.manager [-] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:53:21 np0005555140 nova_compute[187006]: 2025-12-11 09:53:21.224 187010 DEBUG nova.compute.manager [None req-947725ca-6c7a-4d88-b2e7-261f08a10e72 - - - - - -] [instance: 651518ec-e299-4d5f-863a-12e51bae1a51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:53:21 np0005555140 nova_compute[187006]: 2025-12-11 09:53:21.681 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:21 np0005555140 podman[214192]: 2025-12-11 09:53:21.729656968 +0000 UTC m=+0.099275440 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:53:25 np0005555140 nova_compute[187006]: 2025-12-11 09:53:25.489 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765446790.4887176, 268e9416-7794-482e-8290-95afc792d4c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:53:25 np0005555140 nova_compute[187006]: 2025-12-11 09:53:25.491 187010 INFO nova.compute.manager [-] [instance: 268e9416-7794-482e-8290-95afc792d4c1] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:53:25 np0005555140 nova_compute[187006]: 2025-12-11 09:53:25.524 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:25 np0005555140 nova_compute[187006]: 2025-12-11 09:53:25.648 187010 DEBUG nova.compute.manager [None req-f8b54c12-9506-4deb-9c10-c73a392ed9c0 - - - - - -] [instance: 268e9416-7794-482e-8290-95afc792d4c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:53:26 np0005555140 nova_compute[187006]: 2025-12-11 09:53:26.683 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:29 np0005555140 podman[214217]: 2025-12-11 09:53:29.702762403 +0000 UTC m=+0.068594219 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 11 04:53:29 np0005555140 podman[214218]: 2025-12-11 09:53:29.709270968 +0000 UTC m=+0.068674351 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:53:30 np0005555140 nova_compute[187006]: 2025-12-11 09:53:30.529 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:31 np0005555140 nova_compute[187006]: 2025-12-11 09:53:31.685 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:34 np0005555140 podman[214257]: 2025-12-11 09:53:34.683239899 +0000 UTC m=+0.054938361 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 04:53:35 np0005555140 nova_compute[187006]: 2025-12-11 09:53:35.533 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:36 np0005555140 nova_compute[187006]: 2025-12-11 09:53:36.686 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:37 np0005555140 nova_compute[187006]: 2025-12-11 09:53:37.906 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:37 np0005555140 nova_compute[187006]: 2025-12-11 09:53:37.906 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:37 np0005555140 nova_compute[187006]: 2025-12-11 09:53:37.928 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.017 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.018 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.026 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.026 187010 INFO nova.compute.claims [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.233 187010 DEBUG nova.compute.provider_tree [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.247 187010 DEBUG nova.scheduler.client.report [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.269 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.270 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.314 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.314 187010 DEBUG nova.network.neutron [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.333 187010 INFO nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.349 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.448 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.449 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.449 187010 INFO nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Creating image(s)#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.450 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.450 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.451 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.465 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.527 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.528 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.529 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.540 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.568 187010 DEBUG nova.policy [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:53:38 np0005555140 podman[214280]: 2025-12-11 09:53:38.598128497 +0000 UTC m=+0.061488987 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.614 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.615 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.658 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.660 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.660 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.720 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.721 187010 DEBUG nova.virt.disk.api [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.722 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.783 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.786 187010 DEBUG nova.virt.disk.api [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.786 187010 DEBUG nova.objects.instance [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.802 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.803 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Ensure instance console log exists: /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.804 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.804 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:38 np0005555140 nova_compute[187006]: 2025-12-11 09:53:38.805 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:39.787 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:53:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:39.788 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:53:39 np0005555140 nova_compute[187006]: 2025-12-11 09:53:39.788 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:40 np0005555140 nova_compute[187006]: 2025-12-11 09:53:40.535 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:40 np0005555140 nova_compute[187006]: 2025-12-11 09:53:40.543 187010 DEBUG nova.network.neutron [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Successfully created port: 03b48f25-c36b-4787-921d-4208c750d11b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.289 187010 DEBUG nova.network.neutron [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Successfully updated port: 03b48f25-c36b-4787-921d-4208c750d11b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.324 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.325 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.325 187010 DEBUG nova.network.neutron [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.376 187010 DEBUG nova.compute.manager [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-changed-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.376 187010 DEBUG nova.compute.manager [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing instance network info cache due to event network-changed-03b48f25-c36b-4787-921d-4208c750d11b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.377 187010 DEBUG oslo_concurrency.lockutils [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.452 187010 DEBUG nova.network.neutron [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:53:41 np0005555140 nova_compute[187006]: 2025-12-11 09:53:41.688 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.569 187010 DEBUG nova.network.neutron [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.595 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.596 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Instance network_info: |[{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.596 187010 DEBUG oslo_concurrency.lockutils [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.596 187010 DEBUG nova.network.neutron [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing network info cache for port 03b48f25-c36b-4787-921d-4208c750d11b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.599 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Start _get_guest_xml network_info=[{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.605 187010 WARNING nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.613 187010 DEBUG nova.virt.libvirt.host [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.613 187010 DEBUG nova.virt.libvirt.host [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.616 187010 DEBUG nova.virt.libvirt.host [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.617 187010 DEBUG nova.virt.libvirt.host [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.617 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.617 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.618 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.618 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.618 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.619 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.619 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.619 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.619 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.620 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.620 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.620 187010 DEBUG nova.virt.hardware [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.624 187010 DEBUG nova.virt.libvirt.vif [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:53:38Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.625 187010 DEBUG nova.network.os_vif_util [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.626 187010 DEBUG nova.network.os_vif_util [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:0c:68,bridge_name='br-int',has_traffic_filtering=True,id=03b48f25-c36b-4787-921d-4208c750d11b,network=Network(dd4b248c-547e-4f62-84f1-a16f3c8c8d18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b48f25-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.627 187010 DEBUG nova.objects.instance [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.643 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <uuid>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</uuid>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <name>instance-00000003</name>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:53:42</nova:creationTime>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <entry name="serial">3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <entry name="uuid">3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:8b:0c:68"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <target dev="tap03b48f25-c3"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log" append="off"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:53:42 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:53:42 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:53:42 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:53:42 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.644 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Preparing to wait for external event network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.644 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.644 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.644 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.645 187010 DEBUG nova.virt.libvirt.vif [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:53:38Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.645 187010 DEBUG nova.network.os_vif_util [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.646 187010 DEBUG nova.network.os_vif_util [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:0c:68,bridge_name='br-int',has_traffic_filtering=True,id=03b48f25-c36b-4787-921d-4208c750d11b,network=Network(dd4b248c-547e-4f62-84f1-a16f3c8c8d18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b48f25-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.646 187010 DEBUG os_vif [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:0c:68,bridge_name='br-int',has_traffic_filtering=True,id=03b48f25-c36b-4787-921d-4208c750d11b,network=Network(dd4b248c-547e-4f62-84f1-a16f3c8c8d18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b48f25-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.647 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.647 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.647 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.651 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.652 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03b48f25-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.652 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03b48f25-c3, col_values=(('external_ids', {'iface-id': '03b48f25-c36b-4787-921d-4208c750d11b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:0c:68', 'vm-uuid': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.654 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:42 np0005555140 NetworkManager[55531]: <info>  [1765446822.6552] manager: (tap03b48f25-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.657 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.663 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.664 187010 INFO os_vif [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:0c:68,bridge_name='br-int',has_traffic_filtering=True,id=03b48f25-c36b-4787-921d-4208c750d11b,network=Network(dd4b248c-547e-4f62-84f1-a16f3c8c8d18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b48f25-c3')#033[00m
Dec 11 04:53:42 np0005555140 podman[214317]: 2025-12-11 09:53:42.7037053 +0000 UTC m=+0.066817908 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 11 04:53:42 np0005555140 podman[214316]: 2025-12-11 09:53:42.725676614 +0000 UTC m=+0.093616579 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.934 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.935 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.935 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:8b:0c:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:53:42 np0005555140 nova_compute[187006]: 2025-12-11 09:53:42.936 187010 INFO nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Using config drive#033[00m
Dec 11 04:53:43 np0005555140 nova_compute[187006]: 2025-12-11 09:53:43.729 187010 INFO nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Creating config drive at /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config#033[00m
Dec 11 04:53:43 np0005555140 nova_compute[187006]: 2025-12-11 09:53:43.735 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjj89qukk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:43 np0005555140 nova_compute[187006]: 2025-12-11 09:53:43.863 187010 DEBUG oslo_concurrency.processutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjj89qukk" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:43 np0005555140 kernel: tap03b48f25-c3: entered promiscuous mode
Dec 11 04:53:43 np0005555140 NetworkManager[55531]: <info>  [1765446823.9317] manager: (tap03b48f25-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Dec 11 04:53:43 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:43Z|00046|binding|INFO|Claiming lport 03b48f25-c36b-4787-921d-4208c750d11b for this chassis.
Dec 11 04:53:43 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:43Z|00047|binding|INFO|03b48f25-c36b-4787-921d-4208c750d11b: Claiming fa:16:3e:8b:0c:68 10.100.0.6
Dec 11 04:53:43 np0005555140 nova_compute[187006]: 2025-12-11 09:53:43.932 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.954 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:0c:68 10.100.0.6'], port_security=['fa:16:3e:8b:0c:68 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c4077e4b-5e35-4f12-851b-3c078e730784', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be1a9aae-13dd-4cf1-babc-e8afc3c22735, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=03b48f25-c36b-4787-921d-4208c750d11b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.955 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 03b48f25-c36b-4787-921d-4208c750d11b in datapath dd4b248c-547e-4f62-84f1-a16f3c8c8d18 bound to our chassis#033[00m
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.957 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd4b248c-547e-4f62-84f1-a16f3c8c8d18#033[00m
Dec 11 04:53:43 np0005555140 systemd-udevd[214381]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.974 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[776095be-6f9d-4b69-bad3-b969e6e3da04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.975 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd4b248c-51 in ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.978 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd4b248c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.978 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[173e5fd2-3fd0-455c-af06-8f573de2b376]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.980 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[5f509b7c-ecd4-4e70-8b8a-0ca5af8463e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:43 np0005555140 NetworkManager[55531]: <info>  [1765446823.9837] device (tap03b48f25-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:53:43 np0005555140 NetworkManager[55531]: <info>  [1765446823.9843] device (tap03b48f25-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:53:43 np0005555140 systemd-machined[153398]: New machine qemu-3-instance-00000003.
Dec 11 04:53:43 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:43.991 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[6a94fc8a-7805-4cdf-b877-15f7c8a2af75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:44Z|00048|binding|INFO|Setting lport 03b48f25-c36b-4787-921d-4208c750d11b ovn-installed in OVS
Dec 11 04:53:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:44Z|00049|binding|INFO|Setting lport 03b48f25-c36b-4787-921d-4208c750d11b up in Southbound
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.013 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:44 np0005555140 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.026 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[7072a189-e1dd-4223-b54f-8c45eed4df38]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.059 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[6efc21da-e96f-4a97-9687-d903181a78ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 NetworkManager[55531]: <info>  [1765446824.0658] manager: (tapdd4b248c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.066 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[2084dbde-d011-4841-87b7-3939fcb121c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.103 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1fd928-9083-44a6-8ff0-6a585ff6a0de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.108 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[8e866e19-1624-4fb6-ac27-c8bd34f3540c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 NetworkManager[55531]: <info>  [1765446824.1319] device (tapdd4b248c-50): carrier: link connected
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.137 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c03d14-5f0a-4fb1-be78-d4949fab9da4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.155 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9f9353-e0fe-4989-a8d3-115199f57089]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd4b248c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:5b:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 318270, 'reachable_time': 22827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214416, 'error': None, 'target': 'ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.179 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[41dfd2c6-f13b-447c-9ab0-89e7d80d2660]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:5b4e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 318270, 'tstamp': 318270}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214418, 'error': None, 'target': 'ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.197 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[06e10104-ea56-4cbd-b924-3d8087f0bd21]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd4b248c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:5b:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 318270, 'reachable_time': 22827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214419, 'error': None, 'target': 'ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.230 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d65da1-cdc2-44c6-9b23-87e96877962e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.295 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[94a60cfa-49a4-413d-9467-b817bf2dabb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.297 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd4b248c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.297 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.298 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd4b248c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:44 np0005555140 kernel: tapdd4b248c-50: entered promiscuous mode
Dec 11 04:53:44 np0005555140 NetworkManager[55531]: <info>  [1765446824.3001] manager: (tapdd4b248c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.299 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.303 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd4b248c-50, col_values=(('external_ids', {'iface-id': 'a5ae34d9-5f49-49f6-8020-4ec1e4b97b26'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.304 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.304 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:44Z|00050|binding|INFO|Releasing lport a5ae34d9-5f49-49f6-8020-4ec1e4b97b26 from this chassis (sb_readonly=0)
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.306 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd4b248c-547e-4f62-84f1-a16f3c8c8d18.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd4b248c-547e-4f62-84f1-a16f3c8c8d18.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.307 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[daace250-aaf8-488e-8c17-396eb5a9f3f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.308 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-dd4b248c-547e-4f62-84f1-a16f3c8c8d18
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/dd4b248c-547e-4f62-84f1-a16f3c8c8d18.pid.haproxy
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID dd4b248c-547e-4f62-84f1-a16f3c8c8d18
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:53:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:44.310 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'env', 'PROCESS_TAG=haproxy-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd4b248c-547e-4f62-84f1-a16f3c8c8d18.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.315 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.617 187010 DEBUG nova.network.neutron [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updated VIF entry in instance network info cache for port 03b48f25-c36b-4787-921d-4208c750d11b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.618 187010 DEBUG nova.network.neutron [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.640 187010 DEBUG oslo_concurrency.lockutils [req-bb4b7a92-e04b-4b96-adb3-b8f3058d62c0 req-8e3e1035-3a9d-4543-919b-ca8c48262934 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.773 187010 DEBUG nova.compute.manager [req-10dcaf5c-1089-4913-85d4-b22b43943506 req-6c3ad109-d0b7-468b-8d6f-0112eb51f6e8 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.773 187010 DEBUG oslo_concurrency.lockutils [req-10dcaf5c-1089-4913-85d4-b22b43943506 req-6c3ad109-d0b7-468b-8d6f-0112eb51f6e8 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.773 187010 DEBUG oslo_concurrency.lockutils [req-10dcaf5c-1089-4913-85d4-b22b43943506 req-6c3ad109-d0b7-468b-8d6f-0112eb51f6e8 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.774 187010 DEBUG oslo_concurrency.lockutils [req-10dcaf5c-1089-4913-85d4-b22b43943506 req-6c3ad109-d0b7-468b-8d6f-0112eb51f6e8 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.774 187010 DEBUG nova.compute.manager [req-10dcaf5c-1089-4913-85d4-b22b43943506 req-6c3ad109-d0b7-468b-8d6f-0112eb51f6e8 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Processing event network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.861 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446824.861037, 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.862 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] VM Started (Lifecycle Event)#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.864 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.884 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.889 187010 INFO nova.virt.libvirt.driver [-] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Instance spawned successfully.#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.889 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.891 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.894 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.915 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.915 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.916 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.916 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.917 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.917 187010 DEBUG nova.virt.libvirt.driver [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.921 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.922 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446824.86197, 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.922 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.951 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.955 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446824.8669517, 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.956 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.980 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.984 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:53:44 np0005555140 podman[214457]: 2025-12-11 09:53:44.890012542 +0000 UTC m=+0.041854869 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.989 187010 INFO nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Took 6.54 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:53:44 np0005555140 nova_compute[187006]: 2025-12-11 09:53:44.989 187010 DEBUG nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:53:45 np0005555140 nova_compute[187006]: 2025-12-11 09:53:45.014 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:53:45 np0005555140 nova_compute[187006]: 2025-12-11 09:53:45.054 187010 INFO nova.compute.manager [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Took 7.07 seconds to build instance.#033[00m
Dec 11 04:53:45 np0005555140 nova_compute[187006]: 2025-12-11 09:53:45.072 187010 DEBUG oslo_concurrency.lockutils [None req-95e3511f-cd71-44b7-b915-b6885b3dc57e 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:45 np0005555140 podman[214457]: 2025-12-11 09:53:45.483608378 +0000 UTC m=+0.635450685 container create cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 11 04:53:45 np0005555140 systemd[1]: Started libpod-conmon-cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05.scope.
Dec 11 04:53:45 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:53:45 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b00141b03919a2dee7108b909db5c988fa9831adf0d62397dfe0fe48b4a5c07/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:53:45 np0005555140 podman[214457]: 2025-12-11 09:53:45.934766509 +0000 UTC m=+1.086608916 container init cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:53:45 np0005555140 podman[214457]: 2025-12-11 09:53:45.941220233 +0000 UTC m=+1.093062540 container start cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 04:53:45 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [NOTICE]   (214476) : New worker (214478) forked
Dec 11 04:53:45 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [NOTICE]   (214476) : Loading success.
Dec 11 04:53:46 np0005555140 nova_compute[187006]: 2025-12-11 09:53:46.689 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:46 np0005555140 nova_compute[187006]: 2025-12-11 09:53:46.852 187010 DEBUG nova.compute.manager [req-8801b4d4-2fbe-4971-8410-5c3cd4e85bdc req-333df624-831e-4d00-9238-e9ecc69fe025 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:46 np0005555140 nova_compute[187006]: 2025-12-11 09:53:46.853 187010 DEBUG oslo_concurrency.lockutils [req-8801b4d4-2fbe-4971-8410-5c3cd4e85bdc req-333df624-831e-4d00-9238-e9ecc69fe025 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:46 np0005555140 nova_compute[187006]: 2025-12-11 09:53:46.853 187010 DEBUG oslo_concurrency.lockutils [req-8801b4d4-2fbe-4971-8410-5c3cd4e85bdc req-333df624-831e-4d00-9238-e9ecc69fe025 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:46 np0005555140 nova_compute[187006]: 2025-12-11 09:53:46.854 187010 DEBUG oslo_concurrency.lockutils [req-8801b4d4-2fbe-4971-8410-5c3cd4e85bdc req-333df624-831e-4d00-9238-e9ecc69fe025 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:46 np0005555140 nova_compute[187006]: 2025-12-11 09:53:46.854 187010 DEBUG nova.compute.manager [req-8801b4d4-2fbe-4971-8410-5c3cd4e85bdc req-333df624-831e-4d00-9238-e9ecc69fe025 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] No waiting events found dispatching network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:53:46 np0005555140 nova_compute[187006]: 2025-12-11 09:53:46.854 187010 WARNING nova.compute.manager [req-8801b4d4-2fbe-4971-8410-5c3cd4e85bdc req-333df624-831e-4d00-9238-e9ecc69fe025 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received unexpected event network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b for instance with vm_state active and task_state None.#033[00m
Dec 11 04:53:47 np0005555140 nova_compute[187006]: 2025-12-11 09:53:47.654 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:47 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:47.791 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:53:48 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:48Z|00051|binding|INFO|Releasing lport a5ae34d9-5f49-49f6-8020-4ec1e4b97b26 from this chassis (sb_readonly=0)
Dec 11 04:53:48 np0005555140 NetworkManager[55531]: <info>  [1765446828.4518] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Dec 11 04:53:48 np0005555140 NetworkManager[55531]: <info>  [1765446828.4530] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.453 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:48 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:48Z|00052|binding|INFO|Releasing lport a5ae34d9-5f49-49f6-8020-4ec1e4b97b26 from this chassis (sb_readonly=0)
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.481 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.487 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:48.617 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:48.618 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:53:48.619 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.687 187010 DEBUG nova.compute.manager [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-changed-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.687 187010 DEBUG nova.compute.manager [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing instance network info cache due to event network-changed-03b48f25-c36b-4787-921d-4208c750d11b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.688 187010 DEBUG oslo_concurrency.lockutils [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.688 187010 DEBUG oslo_concurrency.lockutils [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:53:48 np0005555140 nova_compute[187006]: 2025-12-11 09:53:48.688 187010 DEBUG nova.network.neutron [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing network info cache for port 03b48f25-c36b-4787-921d-4208c750d11b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:53:49 np0005555140 nova_compute[187006]: 2025-12-11 09:53:49.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:50 np0005555140 nova_compute[187006]: 2025-12-11 09:53:50.103 187010 DEBUG nova.network.neutron [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updated VIF entry in instance network info cache for port 03b48f25-c36b-4787-921d-4208c750d11b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:53:50 np0005555140 nova_compute[187006]: 2025-12-11 09:53:50.103 187010 DEBUG nova.network.neutron [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:53:50 np0005555140 nova_compute[187006]: 2025-12-11 09:53:50.125 187010 DEBUG oslo_concurrency.lockutils [req-8dd755bb-b514-4d81-8409-aad531dbcfaa req-79161487-17a4-48ab-bdb9-f45be1156f0d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.531 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3e51cafe3216f130b4ed84cc27d5f9d91ed6dcf88509a4458db4d10571ef13cf" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.608 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 11 Dec 2025 09:53:50 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-532b6776-ca20-4f92-bf18-7c993364f812 x-openstack-request-id: req-532b6776-ca20-4f92-bf18-7c993364f812 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.608 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "21dd3019-ee53-4a0d-bfc3-73b12fb409db", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/21dd3019-ee53-4a0d-bfc3-73b12fb409db"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/21dd3019-ee53-4a0d-bfc3-73b12fb409db"}]}, {"id": "8ceb5bb7-cd53-4ae6-a352-a5023850ca5b", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/8ceb5bb7-cd53-4ae6-a352-a5023850ca5b"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/8ceb5bb7-cd53-4ae6-a352-a5023850ca5b"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.608 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-532b6776-ca20-4f92-bf18-7c993364f812 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.611 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/8ceb5bb7-cd53-4ae6-a352-a5023850ca5b -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}3e51cafe3216f130b4ed84cc27d5f9d91ed6dcf88509a4458db4d10571ef13cf" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.687 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Thu, 11 Dec 2025 09:53:50 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-8909917a-8f4e-4b17-b2fa-2a995c3c2536 x-openstack-request-id: req-8909917a-8f4e-4b17-b2fa-2a995c3c2536 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.687 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "8ceb5bb7-cd53-4ae6-a352-a5023850ca5b", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/8ceb5bb7-cd53-4ae6-a352-a5023850ca5b"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/8ceb5bb7-cd53-4ae6-a352-a5023850ca5b"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.687 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/8ceb5bb7-cd53-4ae6-a352-a5023850ca5b used request id req-8909917a-8f4e-4b17-b2fa-2a995c3c2536 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.689 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'name': 'tempest-TestNetworkBasicOps-server-1741934441', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'user_id': '277eaa28c80b403abb371276e6721821', 'hostId': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.694 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 / tap03b48f25-c3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.694 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebf34a13-1d74-4a0d-8d3b-dc748a8c4e01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.689869', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4ba39b44-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': 'b370aa96619694ece9175ecee406e69d43cf9dc5c4f7629517104b1dcf28e75e'}]}, 'timestamp': '2025-12-11 09:53:50.695538', '_unique_id': 'eb060eae589f48eeae8b214de94d126b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.701 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.705 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.731 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.read.latency volume: 189319936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.732 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.read.latency volume: 843874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5aeca255-00e9-449d-b94e-e0ec174206d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 189319936, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.705397', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4ba93342-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '5a430670fee32369be0ae85b9c8b0bfe7951abc1eb10965432ebdc2476037f8a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 843874, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.705397', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4ba942e2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '2cedebd4ddf83d530384e5e92f1c9ece1437c64f3813d45fcfb706eb80ad3b09'}]}, 'timestamp': '2025-12-11 09:53:50.732302', '_unique_id': '91225aa17d5c4d89bd79048e6e62e6b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.733 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd88fe98e-f390-4c85-ac62-30561bac8920', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.735080', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4ba9baf6-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': '2fcb46632eb4a2909ce496cdcd0f6aa7f1cba82ca23465b4199087d2945f55e4'}]}, 'timestamp': '2025-12-11 09:53:50.735372', '_unique_id': '918057992e7845298b5af812332818c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.735 12 ERROR oslo_messaging.notify.messaging 
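[editor's note] The traceback above shows two chained exceptions: the low-level socket error (`ConnectionRefusedError`, errno 111) is caught inside kombu's `_reraise_as_library_errors` context manager and re-raised as `kombu.exceptions.OperationalError` with the original as its `__cause__` (`raise ConnectionError(str(exc)) from exc` at connection.py:450). A minimal sketch of that exception-chaining pattern, using local stand-in classes rather than kombu's real ones:

```python
from contextlib import contextmanager

class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""

@contextmanager
def reraise_as_library_errors():
    # Mirrors the pattern in kombu/connection.py:_reraise_as_library_errors:
    # wrap low-level socket failures in the library's own exception type,
    # preserving the original via __cause__.
    try:
        yield
    except OSError as exc:
        raise OperationalError(str(exc)) from exc

def connect_to_down_broker():
    # Simulate the broker socket rejecting the connection (errno 111).
    raise ConnectionRefusedError(111, "Connection refused")

try:
    with reraise_as_library_errors():
        connect_to_down_broker()
except OperationalError as err:
    chained = err
```

Because of the explicit `from exc`, the log prints both tracebacks joined by "The above exception was the direct cause of the following exception" — the wrapped error keeps the original socket failure reachable via `chained.__cause__`.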
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.736 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.736 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>]
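[editor's note] The "Prevent pollster ... from polling ... anymore" line above is ceilometer's permanent-error blacklisting: when a pollster signals via `PollsterPermanentError` that a resource can never yield data (here, `LibvirtInspector` provides no per-device latency), the polling manager stops asking for that resource on that source. A simplified sketch of the pattern, with illustrative stand-in names rather than ceilometer's actual internals:

```python
class PollsterPermanentError(Exception):
    """Stand-in for ceilometer.polling.plugin_base.PollsterPermanentError."""
    def __init__(self, resources):
        super().__init__(resources)
        self.fail_res_list = resources

class PollingManager:
    def __init__(self):
        self.blacklist = []  # resources excluded from future polls

    def poll(self, pollster, resources):
        # Skip resources already known to be permanently unpollable.
        candidates = [r for r in resources if r not in self.blacklist]
        try:
            return pollster(candidates)
        except PollsterPermanentError as err:
            # Stop polling these resources on this source from now on.
            self.blacklist.extend(err.fail_res_list)
            return []

def latency_pollster(resources):
    # The inspector provides no data for these resources at all.
    raise PollsterPermanentError(resources)

mgr = PollingManager()
mgr.poll(latency_pollster, ["instance-00000003"])
```

After the first failed cycle the resource sits on the blacklist, so subsequent polling intervals no longer log the same inspector error for it.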
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.737 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.749 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.749 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1525ab96-af17-4966-bcb3-67081e309ddc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.737161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4babe25e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.362525385, 'message_signature': '0e5d63f1635653a90a77b19b73f70be8979d3771b7a12fb1a35a6fafe9152e25'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.737161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4babeefc-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.362525385, 'message_signature': '4eda5dae35e6595c46709617c1877a6ecbf464f42c8429009f2ef58c6890ff80'}]}, 'timestamp': '2025-12-11 09:53:50.749789', '_unique_id': '4543d558b5be47c7a49cba3370c66b6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.751 12 ERROR oslo_messaging.notify.messaging 
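[editor's note] Every traceback in this stretch bottoms out in `self.sock.connect(sa)` failing with errno 111, i.e. nothing is accepting TCP connections on the RabbitMQ endpoint the agent is configured with. A quick way to distinguish "broker process down" from other network failures is a plain TCP probe; the sketch below assumes nothing about the deployment (host, port, and the probed address are illustrative, not taken from this log):

```python
import errno
import socket

def probe(host: str, port: int, timeout: float = 2.0) -> str:
    """Return 'open', 'refused', or an errno name for a TCP connect attempt."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(timeout)
    try:
        sock.connect((host, port))
        return "open"
    except ConnectionRefusedError:
        return "refused"  # errno 111: host reachable, nothing listening
    except OSError as exc:
        # e.g. ETIMEDOUT / EHOSTUNREACH point at network or firewall issues
        return errno.errorcode.get(exc.errno, str(exc))
    finally:
        sock.close()

# Probe a port on localhost that is almost certainly unbound,
# reproducing the "refused" outcome seen in the log:
result = probe("127.0.0.1", 1)
```

A "refused" result matches this log's failure mode (the socket is actively rejected), whereas a timeout would instead suggest the traffic never reaches the broker host at all.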
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5568037-0874-4a0e-91a8-d4f644878e6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.752117', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bac53ec-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': 'a472dcee2dc66d032b5609a34404fc788e551f805207c254628acecdb9388514'}]}, 'timestamp': '2025-12-11 09:53:50.752369', '_unique_id': '4545701e9aec4b1f89effccce85f9eac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.752 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.753 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.753 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.753 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98128995-9fe8-4649-b1c9-ffd347055f51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.753513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4bac89d4-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.362525385, 'message_signature': '1a4f40d18562d48dc033aff8376659304fb93adf6a64c63c8bf519facb49baa0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.753513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4bac9208-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.362525385, 'message_signature': '8244de20ea722f750b81b44bb3cd58dab1c0f14a42c63a657505fed34e1dec41'}]}, 'timestamp': '2025-12-11 09:53:50.753969', '_unique_id': '08a86cd22567406692cda8b080dbff37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.754 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.756 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.757 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.757 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>]
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.757 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.757 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6106e408-ecf8-4ee8-85bd-7ba196622e0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.757293', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bad1de0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': 'fa5dbd8c3660ccf548c6688e62f6d48cb9e9f6f8626fe3003ee70ea827b381b4'}]}, 'timestamp': '2025-12-11 09:53:50.757551', '_unique_id': '8f79b7ffde0e496f93e8fc475796ac70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.759 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.759 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>]
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.759 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e30934bf-2917-4bfa-9ce4-beae05aed4f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.759322', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bad6cd2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': '1cfe7d62f29bf967d5acb5e42b4e2cfcdd8f38d024eeac389e2812a2c2979624'}]}, 'timestamp': '2025-12-11 09:53:50.759563', '_unique_id': '4ca078037b034b869d5d1b1c82aafc39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.760 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.762 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.782 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.783 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1: ceilometer.compute.pollsters.NoVolumeException
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.783 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '366709c9-c691-42eb-8fb1-cb2df3b5b164', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.783531', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bb12318-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': 'a14a3021601a459f39c1175e5df987f673833d256d3d3cbb48565b0c9e507b70'}]}, 'timestamp': '2025-12-11 09:53:50.784095', '_unique_id': '0dcac248af0c4e8aaf609df447f00603'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.785 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.786 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.786 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd562a0c8-d703-41fb-9f50-acb7404ed9d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.786792', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bb1a04a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': 'd2765cd1b5510efa15a19af6c34e57894f044dcb5d8e1c66ab07212a49ba7248'}]}, 'timestamp': '2025-12-11 09:53:50.787137', '_unique_id': '048ef2e3b35f451da9ec9de3653dcedf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.787 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.788 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cd9dd0e-c4bf-45b6-bd49-dd5a70ae112d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.788533', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bb1e320-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': '276652a6ddf0a1e24fc73144fafe94381fb85338f5c915c27cfbbabbf3c3a41a'}]}, 'timestamp': '2025-12-11 09:53:50.788836', '_unique_id': '2a845d9cbe6a4039bc94302f10565c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.789 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.790 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '724de86e-e107-4add-b108-1c7cda35874c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.790195', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bb223bc-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': '9a106ade2a5e08daa850e1408424004d52a9c4d7c2bda4ce2d5c95deffcd1382'}]}, 'timestamp': '2025-12-11 09:53:50.790486', '_unique_id': '54d3895f7cf24cbb9688e98962845cdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.791 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24bb3e23-6ab6-4cea-bbd7-75b8077c6d18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000003-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-tap03b48f25-c3', 'timestamp': '2025-12-11T09:53:50.791884', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'tap03b48f25-c3', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:0c:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03b48f25-c3'}, 'message_id': '4bb265fc-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.315249892, 'message_signature': 'd2e04286857df9bd970674ec59cbbf1ccff3bc5a496a704d6060f3a61337f953'}]}, 'timestamp': '2025-12-11 09:53:50.792187', '_unique_id': 'c10fdbd3224d485ab102a5095062788c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.792 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.793 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.793 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/cpu volume: 5620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bced670-642d-4cf3-8555-917a258a29ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5620000000, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'timestamp': '2025-12-11T09:53:50.793603', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4bb2a94a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.407561844, 'message_signature': '96f4460f83b73ecf1704d354b100270ad1a8280e4ae17d41e8672732494c52ed'}]}, 'timestamp': '2025-12-11 09:53:50.793921', '_unique_id': 'c0ea620faa1d4a378a168a4b02391dd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.794 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.795 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.795 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.795 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0038897d-a271-4215-ab27-acf8bc945b28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.795309', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4bb2ebee-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.362525385, 'message_signature': 'ee08c52d24b3e2b8e3a9d3e63615f860f25447342c770cdf5907550db928e4c4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.795309', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4bb2f652-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.362525385, 'message_signature': '6415b46a8dd3fe48a4ce659a400380ff37efd3d90146aa30e89925edcd027841'}]}, 'timestamp': '2025-12-11 09:53:50.795894', '_unique_id': 'a794b90c2fb241459f2594274164fe24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.796 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.797 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.797 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1741934441>]
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.797 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.797 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eb051d0-73ab-4703-ab72-87289b1130e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.797677', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4bb34832-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '68e63b751ed2a3c5fdfd2abeba8c232016b8c1d0ff60f31e64beeb8f6e845be0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.797677', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4bb3534a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': 'd4a62863eba03c040095df71277b2388e02f1d39a7094f15baf0ac8a729b8bb2'}]}, 'timestamp': '2025-12-11 09:53:50.798242', '_unique_id': '15b8f01025b1436cb3ffcf028f6c54bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.798 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.799 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.799 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e70fe45a-1703-4e8d-9cd1-59c0d2ef2d8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.799642', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4bb394c2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '2d8f840647c548d5c2673e40aa18176aa09d94763e7f408ae0cd4b8c222f7258'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.799642', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4bb39ff8-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '9dc34ca91975aceabadc576f3fbba118988841563644bcba175d58b53af36a15'}]}, 'timestamp': '2025-12-11 09:53:50.800207', '_unique_id': '78b147cc16b141c5973d3e3356c4e2b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.800 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.801 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.801 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cface6e0-1313-43bf-a267-1d45c9d17c59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.801594', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4bb3e0ee-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': 'a41ffa07a5d1b0b38c5c00d6474d4bc562a71959b1eed0108dec35594a01a71a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': 
None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.801594', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4bb3eddc-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '464608094dae0c32c2a870f8f9f08b48290109bd6c0787b70fecefc8f3203da0'}]}, 'timestamp': '2025-12-11 09:53:50.802200', '_unique_id': 'd0a454f61c824b5eb6185c6b6f860855'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.802 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.803 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.803 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd538ce9-2988-40cf-9fc8-6e2969dd40a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.803603', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4bb42f5e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '5866fec1407be353c80b4c2e55f6bb8e1c826316cd499aed162008f0bc081576'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.803603', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4bb43a6c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '18fe24e334b5ebb2caf4fa0c02377c752bf2b0351cba28368931c01dfbdec6a4'}]}, 'timestamp': '2025-12-11 09:53:50.804159', '_unique_id': '3523f3111a0b4936a7346e96927d9fdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.804 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.805 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.805 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.805 12 DEBUG ceilometer.compute.pollsters [-] 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b1b2c5a-2e51-4881-beab-6f5e4793010b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-vda', 'timestamp': '2025-12-11T09:53:50.805539', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4bb47b44-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': 'a261c311fbb15c4ac3f2d65d32b2c8338b38f5cdb6495d6cef40c79097b5dd1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-sda', 'timestamp': '2025-12-11T09:53:50.805539', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1741934441', 'name': 'instance-00000003', 'instance_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4bb4868e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3189.330774683, 'message_signature': '8fe6593583a1a5b5854e38c63634628e5f179b482388960b00c70bcf42c8de34'}]}, 'timestamp': '2025-12-11 09:53:50.806113', '_unique_id': 'b8cbe301bc2644b888b2740a2e512a3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:53:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:53:50.806 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:53:50 np0005555140 nova_compute[187006]: 2025-12-11 09:53:50.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:50 np0005555140 nova_compute[187006]: 2025-12-11 09:53:50.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:53:50 np0005555140 nova_compute[187006]: 2025-12-11 09:53:50.869 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:53:51 np0005555140 nova_compute[187006]: 2025-12-11 09:53:51.691 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:51 np0005555140 nova_compute[187006]: 2025-12-11 09:53:51.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:51 np0005555140 nova_compute[187006]: 2025-12-11 09:53:51.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:52 np0005555140 nova_compute[187006]: 2025-12-11 09:53:52.657 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:52 np0005555140 podman[214488]: 2025-12-11 09:53:52.709809535 +0000 UTC m=+0.076167304 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:53:52 np0005555140 nova_compute[187006]: 2025-12-11 09:53:52.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:52 np0005555140 nova_compute[187006]: 2025-12-11 09:53:52.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:53:52 np0005555140 nova_compute[187006]: 2025-12-11 09:53:52.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:53 np0005555140 nova_compute[187006]: 2025-12-11 09:53:53.864 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:53 np0005555140 nova_compute[187006]: 2025-12-11 09:53:53.864 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:53 np0005555140 nova_compute[187006]: 2025-12-11 09:53:53.865 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:53 np0005555140 nova_compute[187006]: 2025-12-11 09:53:53.865 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:53:53 np0005555140 nova_compute[187006]: 2025-12-11 09:53:53.937 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.019 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.021 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.104 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.255 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.257 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5583MB free_disk=73.33189392089844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.258 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.258 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.338 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.338 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.338 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.387 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.408 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.429 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:53:54 np0005555140 nova_compute[187006]: 2025-12-11 09:53:54.430 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:53:55 np0005555140 nova_compute[187006]: 2025-12-11 09:53:55.426 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:55 np0005555140 nova_compute[187006]: 2025-12-11 09:53:55.426 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:55 np0005555140 nova_compute[187006]: 2025-12-11 09:53:55.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:53:56 np0005555140 nova_compute[187006]: 2025-12-11 09:53:56.694 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:57 np0005555140 nova_compute[187006]: 2025-12-11 09:53:57.659 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:53:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:58Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:0c:68 10.100.0.6
Dec 11 04:53:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:53:58Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:0c:68 10.100.0.6
Dec 11 04:54:00 np0005555140 podman[214535]: 2025-12-11 09:54:00.679536827 +0000 UTC m=+0.054295424 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 11 04:54:00 np0005555140 podman[214536]: 2025-12-11 09:54:00.713876297 +0000 UTC m=+0.086410233 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 04:54:01 np0005555140 nova_compute[187006]: 2025-12-11 09:54:01.696 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:02 np0005555140 nova_compute[187006]: 2025-12-11 09:54:02.662 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:05 np0005555140 nova_compute[187006]: 2025-12-11 09:54:05.182 187010 INFO nova.compute.manager [None req-e4574ca3-9c8b-493c-96ec-d70a6ddfc033 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Get console output#033[00m
Dec 11 04:54:05 np0005555140 nova_compute[187006]: 2025-12-11 09:54:05.189 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 04:54:05 np0005555140 podman[214577]: 2025-12-11 09:54:05.675804552 +0000 UTC m=+0.050878939 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 04:54:06 np0005555140 nova_compute[187006]: 2025-12-11 09:54:06.699 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:07 np0005555140 nova_compute[187006]: 2025-12-11 09:54:07.665 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:09 np0005555140 podman[214596]: 2025-12-11 09:54:09.674762809 +0000 UTC m=+0.053596165 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:54:11 np0005555140 nova_compute[187006]: 2025-12-11 09:54:11.239 187010 DEBUG oslo_concurrency.lockutils [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "interface-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:11 np0005555140 nova_compute[187006]: 2025-12-11 09:54:11.240 187010 DEBUG oslo_concurrency.lockutils [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:11 np0005555140 nova_compute[187006]: 2025-12-11 09:54:11.240 187010 DEBUG nova.objects.instance [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'flavor' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:54:11 np0005555140 nova_compute[187006]: 2025-12-11 09:54:11.592 187010 DEBUG nova.objects.instance [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_requests' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:54:11 np0005555140 nova_compute[187006]: 2025-12-11 09:54:11.613 187010 DEBUG nova.network.neutron [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:54:11 np0005555140 nova_compute[187006]: 2025-12-11 09:54:11.700 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:12 np0005555140 nova_compute[187006]: 2025-12-11 09:54:12.632 187010 DEBUG nova.policy [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:54:12 np0005555140 nova_compute[187006]: 2025-12-11 09:54:12.668 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:13 np0005555140 podman[214622]: 2025-12-11 09:54:13.694847293 +0000 UTC m=+0.061591156 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Dec 11 04:54:13 np0005555140 podman[214621]: 2025-12-11 09:54:13.725576583 +0000 UTC m=+0.094613760 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 04:54:14 np0005555140 nova_compute[187006]: 2025-12-11 09:54:14.773 187010 DEBUG nova.network.neutron [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Successfully created port: 9137b85f-17eb-4f40-8e38-0663d37e4b82 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:54:16 np0005555140 nova_compute[187006]: 2025-12-11 09:54:16.702 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.581 187010 DEBUG nova.network.neutron [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Successfully updated port: 9137b85f-17eb-4f40-8e38-0663d37e4b82 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.604 187010 DEBUG oslo_concurrency.lockutils [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.605 187010 DEBUG oslo_concurrency.lockutils [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.605 187010 DEBUG nova.network.neutron [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.671 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.716 187010 DEBUG nova.compute.manager [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-changed-9137b85f-17eb-4f40-8e38-0663d37e4b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.717 187010 DEBUG nova.compute.manager [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing instance network info cache due to event network-changed-9137b85f-17eb-4f40-8e38-0663d37e4b82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:54:17 np0005555140 nova_compute[187006]: 2025-12-11 09:54:17.717 187010 DEBUG oslo_concurrency.lockutils [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.824 187010 DEBUG nova.network.neutron [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.841 187010 DEBUG oslo_concurrency.lockutils [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.842 187010 DEBUG oslo_concurrency.lockutils [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.843 187010 DEBUG nova.network.neutron [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing network info cache for port 9137b85f-17eb-4f40-8e38-0663d37e4b82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.846 187010 DEBUG nova.virt.libvirt.vif [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:53:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:53:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.846 187010 DEBUG nova.network.os_vif_util [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.847 187010 DEBUG nova.network.os_vif_util [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.847 187010 DEBUG os_vif [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.847 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.848 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.848 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.851 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.851 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9137b85f-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.852 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9137b85f-17, col_values=(('external_ids', {'iface-id': '9137b85f-17eb-4f40-8e38-0663d37e4b82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:b3:6e', 'vm-uuid': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.853 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.854 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:54:20 np0005555140 NetworkManager[55531]: <info>  [1765446860.8550] manager: (tap9137b85f-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.861 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.863 187010 INFO os_vif [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17')#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.864 187010 DEBUG nova.virt.libvirt.vif [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:53:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:53:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.864 187010 DEBUG nova.network.os_vif_util [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.865 187010 DEBUG nova.network.os_vif_util [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.867 187010 DEBUG nova.virt.libvirt.guest [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] attach device xml: <interface type="ethernet">
Dec 11 04:54:20 np0005555140 nova_compute[187006]:  <mac address="fa:16:3e:9e:b3:6e"/>
Dec 11 04:54:20 np0005555140 nova_compute[187006]:  <model type="virtio"/>
Dec 11 04:54:20 np0005555140 nova_compute[187006]:  <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:54:20 np0005555140 nova_compute[187006]:  <mtu size="1442"/>
Dec 11 04:54:20 np0005555140 nova_compute[187006]:  <target dev="tap9137b85f-17"/>
Dec 11 04:54:20 np0005555140 nova_compute[187006]: </interface>
Dec 11 04:54:20 np0005555140 nova_compute[187006]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec 11 04:54:20 np0005555140 kernel: tap9137b85f-17: entered promiscuous mode
Dec 11 04:54:20 np0005555140 NetworkManager[55531]: <info>  [1765446860.8785] manager: (tap9137b85f-17): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec 11 04:54:20 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:20Z|00053|binding|INFO|Claiming lport 9137b85f-17eb-4f40-8e38-0663d37e4b82 for this chassis.
Dec 11 04:54:20 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:20Z|00054|binding|INFO|9137b85f-17eb-4f40-8e38-0663d37e4b82: Claiming fa:16:3e:9e:b3:6e 10.100.0.25
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.879 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.887 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:b3:6e 10.100.0.25'], port_security=['fa:16:3e:9e:b3:6e 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eec9810c-dd88-422d-805c-da7658bbc958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23aa0fd8-b9ae-46c7-85ea-cfb4f786d11b, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9137b85f-17eb-4f40-8e38-0663d37e4b82) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.888 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9137b85f-17eb-4f40-8e38-0663d37e4b82 in datapath eec9810c-dd88-422d-805c-da7658bbc958 bound to our chassis#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.889 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eec9810c-dd88-422d-805c-da7658bbc958#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.901 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8beb0f-c1cd-42d7-84ad-48bcd1cadfc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.902 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeec9810c-d1 in ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.904 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeec9810c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.904 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab7a7c8-c8ba-47d2-85fb-ca226dd10587]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.905 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6484486a-fbde-4e11-ab3a-b1b8d789fffb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:20 np0005555140 systemd-udevd[214674]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.916 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:20 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:20Z|00055|binding|INFO|Setting lport 9137b85f-17eb-4f40-8e38-0663d37e4b82 ovn-installed in OVS
Dec 11 04:54:20 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:20Z|00056|binding|INFO|Setting lport 9137b85f-17eb-4f40-8e38-0663d37e4b82 up in Southbound
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.918 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.921 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[f9af5871-0f9e-4d8b-a4c8-53a63ac2b846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:20 np0005555140 NetworkManager[55531]: <info>  [1765446860.9221] device (tap9137b85f-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:54:20 np0005555140 NetworkManager[55531]: <info>  [1765446860.9226] device (tap9137b85f-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.945 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[af73b40b-3467-44be-9bc6-e1e363515310]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.975 187010 DEBUG nova.virt.libvirt.driver [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.976 187010 DEBUG nova.virt.libvirt.driver [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.975 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[991dfa0e-2ae1-40d1-aa9c-3747cfd2668a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.976 187010 DEBUG nova.virt.libvirt.driver [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:8b:0c:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:54:20 np0005555140 nova_compute[187006]: 2025-12-11 09:54:20.976 187010 DEBUG nova.virt.libvirt.driver [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:9e:b3:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:54:20 np0005555140 systemd-udevd[214677]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:54:20 np0005555140 NetworkManager[55531]: <info>  [1765446860.9813] manager: (tapeec9810c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Dec 11 04:54:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:20.981 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[741d8bce-ca11-489b-a222-b7cd8c3a1d4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.009 187010 DEBUG nova.virt.libvirt.guest [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:54:21</nova:creationTime>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:54:21 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    <nova:port uuid="9137b85f-17eb-4f40-8e38-0663d37e4b82">
Dec 11 04:54:21 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:21 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:54:21 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:54:21 np0005555140 nova_compute[187006]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.016 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[e2bdd61e-675d-4cf2-a4de-9d715e28a348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.019 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[be60cbab-c770-49b7-be72-919464b9ba5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.034 187010 DEBUG oslo_concurrency.lockutils [None req-4d638750-3c1d-40ff-8a6a-ec37de8ac82a 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:21 np0005555140 NetworkManager[55531]: <info>  [1765446861.0425] device (tapeec9810c-d0): carrier: link connected
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.046 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c1c692-4217-4aae-a15f-ab8f067c4b13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.065 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a9b5a9-8f8d-4066-8ebc-656ea6de863f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeec9810c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:86:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 321961, 'reachable_time': 32197, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214700, 'error': None, 'target': 'ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.083 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[cf54a266-67c1-4bf7-943c-a923d499a6e1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:86d0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 321961, 'tstamp': 321961}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214701, 'error': None, 'target': 'ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.102 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[0af99e15-3f09-428e-abda-fdf61707cc5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeec9810c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:86:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 321961, 'reachable_time': 32197, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214702, 'error': None, 'target': 'ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.137 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c60d41-f5d5-4d7c-80f0-e6a65acb9302]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.156 187010 DEBUG nova.compute.manager [req-e4b24260-2496-46a4-a2f5-d8d383aca52a req-f17a3c7f-da10-4c13-97f1-9bf41075b1bf b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.157 187010 DEBUG oslo_concurrency.lockutils [req-e4b24260-2496-46a4-a2f5-d8d383aca52a req-f17a3c7f-da10-4c13-97f1-9bf41075b1bf b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.158 187010 DEBUG oslo_concurrency.lockutils [req-e4b24260-2496-46a4-a2f5-d8d383aca52a req-f17a3c7f-da10-4c13-97f1-9bf41075b1bf b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.158 187010 DEBUG oslo_concurrency.lockutils [req-e4b24260-2496-46a4-a2f5-d8d383aca52a req-f17a3c7f-da10-4c13-97f1-9bf41075b1bf b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.159 187010 DEBUG nova.compute.manager [req-e4b24260-2496-46a4-a2f5-d8d383aca52a req-f17a3c7f-da10-4c13-97f1-9bf41075b1bf b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] No waiting events found dispatching network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.160 187010 WARNING nova.compute.manager [req-e4b24260-2496-46a4-a2f5-d8d383aca52a req-f17a3c7f-da10-4c13-97f1-9bf41075b1bf b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received unexpected event network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.216 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[1e036754-b9d9-4170-b3fa-d86a29b8de64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.218 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeec9810c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.218 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.219 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeec9810c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:21 np0005555140 NetworkManager[55531]: <info>  [1765446861.2213] manager: (tapeec9810c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.221 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:21 np0005555140 kernel: tapeec9810c-d0: entered promiscuous mode
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.223 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeec9810c-d0, col_values=(('external_ids', {'iface-id': '7592da75-d34e-437d-916e-f52df6e8540e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:21 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:21Z|00057|binding|INFO|Releasing lport 7592da75-d34e-437d-916e-f52df6e8540e from this chassis (sb_readonly=0)
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.224 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.236 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.237 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eec9810c-dd88-422d-805c-da7658bbc958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eec9810c-dd88-422d-805c-da7658bbc958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.238 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f022cdfe-d497-479b-8f1b-dfc9ecfcb769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.239 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-eec9810c-dd88-422d-805c-da7658bbc958
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/eec9810c-dd88-422d-805c-da7658bbc958.pid.haproxy
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID eec9810c-dd88-422d-805c-da7658bbc958
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:54:21 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:21.240 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958', 'env', 'PROCESS_TAG=haproxy-eec9810c-dd88-422d-805c-da7658bbc958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eec9810c-dd88-422d-805c-da7658bbc958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:54:21 np0005555140 podman[214734]: 2025-12-11 09:54:21.630114973 +0000 UTC m=+0.060414854 container create 5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 04:54:21 np0005555140 systemd[1]: Started libpod-conmon-5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae.scope.
Dec 11 04:54:21 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:54:21 np0005555140 podman[214734]: 2025-12-11 09:54:21.5982259 +0000 UTC m=+0.028525831 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:54:21 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8377aabdd7f916197db508837ccb8b0cc28740cb5cb84cb889d15901c18da1f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:54:21 np0005555140 nova_compute[187006]: 2025-12-11 09:54:21.704 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:21 np0005555140 podman[214734]: 2025-12-11 09:54:21.70804245 +0000 UTC m=+0.138342341 container init 5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 11 04:54:21 np0005555140 podman[214734]: 2025-12-11 09:54:21.715041004 +0000 UTC m=+0.145340885 container start 5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 04:54:21 np0005555140 neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958[214749]: [NOTICE]   (214753) : New worker (214755) forked
Dec 11 04:54:21 np0005555140 neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958[214749]: [NOTICE]   (214753) : Loading success.
Dec 11 04:54:22 np0005555140 nova_compute[187006]: 2025-12-11 09:54:22.134 187010 DEBUG nova.network.neutron [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updated VIF entry in instance network info cache for port 9137b85f-17eb-4f40-8e38-0663d37e4b82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:54:22 np0005555140 nova_compute[187006]: 2025-12-11 09:54:22.135 187010 DEBUG nova.network.neutron [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:54:22 np0005555140 nova_compute[187006]: 2025-12-11 09:54:22.152 187010 DEBUG oslo_concurrency.lockutils [req-9b26e76f-4d1f-4672-b2b2-c285ec4c4271 req-5d181607-9a00-439a-8140-7dd2901ceb3a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:54:22 np0005555140 nova_compute[187006]: 2025-12-11 09:54:22.987 187010 DEBUG oslo_concurrency.lockutils [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "interface-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-9137b85f-17eb-4f40-8e38-0663d37e4b82" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:22 np0005555140 nova_compute[187006]: 2025-12-11 09:54:22.988 187010 DEBUG oslo_concurrency.lockutils [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-9137b85f-17eb-4f40-8e38-0663d37e4b82" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.015 187010 DEBUG nova.objects.instance [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'flavor' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.036 187010 DEBUG nova.virt.libvirt.vif [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:53:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:53:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.036 187010 DEBUG nova.network.os_vif_util [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.037 187010 DEBUG nova.network.os_vif_util [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.041 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.043 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.045 187010 DEBUG nova.virt.libvirt.driver [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Attempting to detach device tap9137b85f-17 from instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.045 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] detach device xml: <interface type="ethernet">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <mac address="fa:16:3e:9e:b3:6e"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <model type="virtio"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <mtu size="1442"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <target dev="tap9137b85f-17"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: </interface>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.055 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.058 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <name>instance-00000003</name>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <uuid>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</uuid>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:54:21</nova:creationTime>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:port uuid="9137b85f-17eb-4f40-8e38-0663d37e4b82">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='serial'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='uuid'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk' index='2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config' index='1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:8b:0c:68'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target dev='tap03b48f25-c3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:9e:b3:6e'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target dev='tap9137b85f-17'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='net1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c138,c572</label>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c572</imagelabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.060 187010 INFO nova.virt.libvirt.driver [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully detached device tap9137b85f-17 from instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 from the persistent domain config.
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.060 187010 DEBUG nova.virt.libvirt.driver [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] (1/8): Attempting to detach device tap9137b85f-17 with device alias net1 from instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.061 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] detach device xml: <interface type="ethernet">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <mac address="fa:16:3e:9e:b3:6e"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <model type="virtio"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <mtu size="1442"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <target dev="tap9137b85f-17"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: </interface>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 11 04:54:23 np0005555140 kernel: tap9137b85f-17 (unregistering): left promiscuous mode
Dec 11 04:54:23 np0005555140 NetworkManager[55531]: <info>  [1765446863.1613] device (tap9137b85f-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:54:23 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:23Z|00058|binding|INFO|Releasing lport 9137b85f-17eb-4f40-8e38-0663d37e4b82 from this chassis (sb_readonly=0)
Dec 11 04:54:23 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:23Z|00059|binding|INFO|Setting lport 9137b85f-17eb-4f40-8e38-0663d37e4b82 down in Southbound
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.165 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:54:23 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:23Z|00060|binding|INFO|Removing iface tap9137b85f-17 ovn-installed in OVS
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.168 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.174 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:b3:6e 10.100.0.25'], port_security=['fa:16:3e:9e:b3:6e 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eec9810c-dd88-422d-805c-da7658bbc958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23aa0fd8-b9ae-46c7-85ea-cfb4f786d11b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9137b85f-17eb-4f40-8e38-0663d37e4b82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.175 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9137b85f-17eb-4f40-8e38-0663d37e4b82 in datapath eec9810c-dd88-422d-805c-da7658bbc958 unbound from our chassis
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.177 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eec9810c-dd88-422d-805c-da7658bbc958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.177 187010 DEBUG nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Received event <DeviceRemovedEvent: 1765446863.1771169, 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.178 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9b630571-f141-4534-956f-a5fa38733972]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.178 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958 namespace which is not needed anymore
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.179 187010 DEBUG nova.virt.libvirt.driver [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Start waiting for the detach event from libvirt for device tap9137b85f-17 with device alias net1 for instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.179 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.185 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <name>instance-00000003</name>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <uuid>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</uuid>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:54:21</nova:creationTime>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:port uuid="9137b85f-17eb-4f40-8e38-0663d37e4b82">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='serial'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='uuid'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk' index='2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config' index='1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:8b:0c:68'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target dev='tap03b48f25-c3'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c138,c572</label>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c572</imagelabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.186 187010 INFO nova.virt.libvirt.driver [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully detached device tap9137b85f-17 from instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 from the live domain config.#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.187 187010 DEBUG nova.virt.libvirt.vif [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:53:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:53:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.187 187010 DEBUG nova.network.os_vif_util [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.188 187010 DEBUG nova.network.os_vif_util [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.189 187010 DEBUG os_vif [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.191 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.192 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9137b85f-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.193 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.194 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.197 187010 INFO os_vif [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17')#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.198 187010 DEBUG nova.virt.libvirt.guest [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:54:23</nova:creationTime>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:54:23 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:23 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:54:23 np0005555140 nova_compute[187006]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec 11 04:54:23 np0005555140 podman[214767]: 2025-12-11 09:54:23.258751146 +0000 UTC m=+0.060395633 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.261 187010 DEBUG nova.compute.manager [req-bffbd3e4-f18f-4c5d-a46b-56c33513d8f3 req-500da5cc-2dff-4767-af38-9977b19490db b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.261 187010 DEBUG oslo_concurrency.lockutils [req-bffbd3e4-f18f-4c5d-a46b-56c33513d8f3 req-500da5cc-2dff-4767-af38-9977b19490db b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.262 187010 DEBUG oslo_concurrency.lockutils [req-bffbd3e4-f18f-4c5d-a46b-56c33513d8f3 req-500da5cc-2dff-4767-af38-9977b19490db b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.262 187010 DEBUG oslo_concurrency.lockutils [req-bffbd3e4-f18f-4c5d-a46b-56c33513d8f3 req-500da5cc-2dff-4767-af38-9977b19490db b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.262 187010 DEBUG nova.compute.manager [req-bffbd3e4-f18f-4c5d-a46b-56c33513d8f3 req-500da5cc-2dff-4767-af38-9977b19490db b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] No waiting events found dispatching network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.262 187010 WARNING nova.compute.manager [req-bffbd3e4-f18f-4c5d-a46b-56c33513d8f3 req-500da5cc-2dff-4767-af38-9977b19490db b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received unexpected event network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:54:23 np0005555140 neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958[214749]: [NOTICE]   (214753) : haproxy version is 2.8.14-c23fe91
Dec 11 04:54:23 np0005555140 neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958[214749]: [NOTICE]   (214753) : path to executable is /usr/sbin/haproxy
Dec 11 04:54:23 np0005555140 neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958[214749]: [WARNING]  (214753) : Exiting Master process...
Dec 11 04:54:23 np0005555140 neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958[214749]: [ALERT]    (214753) : Current worker (214755) exited with code 143 (Terminated)
Dec 11 04:54:23 np0005555140 neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958[214749]: [WARNING]  (214753) : All workers exited. Exiting... (0)
Dec 11 04:54:23 np0005555140 systemd[1]: libpod-5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae.scope: Deactivated successfully.
Dec 11 04:54:23 np0005555140 podman[214808]: 2025-12-11 09:54:23.311955498 +0000 UTC m=+0.045585502 container died 5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 04:54:23 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae-userdata-shm.mount: Deactivated successfully.
Dec 11 04:54:23 np0005555140 systemd[1]: var-lib-containers-storage-overlay-8377aabdd7f916197db508837ccb8b0cc28740cb5cb84cb889d15901c18da1f2-merged.mount: Deactivated successfully.
Dec 11 04:54:23 np0005555140 podman[214808]: 2025-12-11 09:54:23.439992563 +0000 UTC m=+0.173622557 container cleanup 5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:54:23 np0005555140 systemd[1]: libpod-conmon-5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae.scope: Deactivated successfully.
Dec 11 04:54:23 np0005555140 podman[214840]: 2025-12-11 09:54:23.504604421 +0000 UTC m=+0.042941289 container remove 5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.510 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9e86df97-b6e7-4735-9c08-c692777d6703]: (4, ('Thu Dec 11 09:54:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958 (5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae)\n5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae\nThu Dec 11 09:54:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958 (5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae)\n5f6ab1a4bc15ab05ea80a8827105ab988c895740e866cc5334ad6e406f68b3ae\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.513 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1ae796-7b42-4e2c-9e66-6d0656bb6b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.514 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeec9810c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.516 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:23 np0005555140 kernel: tapeec9810c-d0: left promiscuous mode
Dec 11 04:54:23 np0005555140 nova_compute[187006]: 2025-12-11 09:54:23.529 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.532 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[8a73cb64-2574-4458-ae5e-1e233880dc98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.553 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[97425746-4f43-4084-ab87-c20ce3eeadb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.554 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[535df745-e9c4-4500-8e00-24cd4e74a94f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.571 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e8274cc0-1c0e-4321-b3c3-c43726b5c326]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 321953, 'reachable_time': 36422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214855, 'error': None, 'target': 'ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.574 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eec9810c-dd88-422d-805c-da7658bbc958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:54:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:23.574 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[4e32c1ea-5a7e-420b-a02c-c429f64ceccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:23 np0005555140 systemd[1]: run-netns-ovnmeta\x2deec9810c\x2ddd88\x2d422d\x2d805c\x2dda7658bbc958.mount: Deactivated successfully.
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.347 187010 DEBUG nova.compute.manager [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-unplugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.348 187010 DEBUG oslo_concurrency.lockutils [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.348 187010 DEBUG oslo_concurrency.lockutils [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.348 187010 DEBUG oslo_concurrency.lockutils [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.349 187010 DEBUG nova.compute.manager [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] No waiting events found dispatching network-vif-unplugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.349 187010 WARNING nova.compute.manager [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received unexpected event network-vif-unplugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.349 187010 DEBUG nova.compute.manager [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.349 187010 DEBUG oslo_concurrency.lockutils [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.350 187010 DEBUG oslo_concurrency.lockutils [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.351 187010 DEBUG oslo_concurrency.lockutils [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.351 187010 DEBUG nova.compute.manager [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] No waiting events found dispatching network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.352 187010 WARNING nova.compute.manager [req-3910fd00-dcc7-49a6-803d-92e41826a991 req-8ff5280b-0ea2-47f7-8b9a-e1f38f8bdf8f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received unexpected event network-vif-plugged-9137b85f-17eb-4f40-8e38-0663d37e4b82 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.645 187010 DEBUG oslo_concurrency.lockutils [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.646 187010 DEBUG oslo_concurrency.lockutils [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:54:25 np0005555140 nova_compute[187006]: 2025-12-11 09:54:25.646 187010 DEBUG nova.network.neutron [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:54:26 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:26Z|00061|binding|INFO|Releasing lport a5ae34d9-5f49-49f6-8020-4ec1e4b97b26 from this chassis (sb_readonly=0)
Dec 11 04:54:26 np0005555140 nova_compute[187006]: 2025-12-11 09:54:26.584 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:26 np0005555140 nova_compute[187006]: 2025-12-11 09:54:26.705 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.412 187010 DEBUG nova.compute.manager [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-deleted-9137b85f-17eb-4f40-8e38-0663d37e4b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.412 187010 INFO nova.compute.manager [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Neutron deleted interface 9137b85f-17eb-4f40-8e38-0663d37e4b82; detaching it from the instance and deleting it from the info cache#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.413 187010 DEBUG nova.network.neutron [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.440 187010 DEBUG nova.objects.instance [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lazy-loading 'system_metadata' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.489 187010 DEBUG nova.objects.instance [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lazy-loading 'flavor' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.531 187010 DEBUG nova.virt.libvirt.vif [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:53:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:53:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.532 187010 DEBUG nova.network.os_vif_util [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converting VIF {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.533 187010 DEBUG nova.network.os_vif_util [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.535 187010 DEBUG nova.virt.libvirt.guest [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.538 187010 DEBUG nova.virt.libvirt.guest [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <name>instance-00000003</name>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <uuid>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</uuid>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:54:23</nova:creationTime>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='serial'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='uuid'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk' index='2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config' index='1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:8b:0c:68'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target dev='tap03b48f25-c3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c138,c572</label>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c572</imagelabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.538 187010 DEBUG nova.virt.libvirt.guest [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.542 187010 DEBUG nova.virt.libvirt.guest [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9e:b3:6e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9137b85f-17"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <name>instance-00000003</name>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <uuid>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</uuid>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:54:23</nova:creationTime>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='serial'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='uuid'>3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk' index='2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/disk.config' index='1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:8b:0c:68'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target dev='tap03b48f25-c3'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1/console.log' append='off'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c138,c572</label>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c572</imagelabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.543 187010 WARNING nova.virt.libvirt.driver [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Detaching interface fa:16:3e:9e:b3:6e failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap9137b85f-17' not found.#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.544 187010 DEBUG nova.virt.libvirt.vif [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:53:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:53:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.544 187010 DEBUG nova.network.os_vif_util [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converting VIF {"id": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "address": "fa:16:3e:9e:b3:6e", "network": {"id": "eec9810c-dd88-422d-805c-da7658bbc958", "bridge": "br-int", "label": "tempest-network-smoke--2068774925", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9137b85f-17", "ovs_interfaceid": "9137b85f-17eb-4f40-8e38-0663d37e4b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.545 187010 DEBUG nova.network.os_vif_util [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.545 187010 DEBUG os_vif [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.547 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.547 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9137b85f-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.548 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.550 187010 INFO os_vif [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:b3:6e,bridge_name='br-int',has_traffic_filtering=True,id=9137b85f-17eb-4f40-8e38-0663d37e4b82,network=Network(eec9810c-dd88-422d-805c-da7658bbc958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9137b85f-17')#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.551 187010 DEBUG nova.virt.libvirt.guest [req-6fce1258-6b07-4394-a0a0-2c2f3109293d req-bc02de09-31c1-41e7-9188-3a04cc1ee412 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1741934441</nova:name>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:54:27</nova:creationTime>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    <nova:port uuid="03b48f25-c36b-4787-921d-4208c750d11b">
Dec 11 04:54:27 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:54:27 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:54:27 np0005555140 nova_compute[187006]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.623 187010 INFO nova.network.neutron [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Port 9137b85f-17eb-4f40-8e38-0663d37e4b82 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.623 187010 DEBUG nova.network.neutron [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.645 187010 DEBUG oslo_concurrency.lockutils [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:54:27 np0005555140 nova_compute[187006]: 2025-12-11 09:54:27.673 187010 DEBUG oslo_concurrency.lockutils [None req-c79873a5-9283-419b-96a3-351504a6942c 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-9137b85f-17eb-4f40-8e38-0663d37e4b82" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.195 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.751 187010 DEBUG nova.compute.manager [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-changed-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.752 187010 DEBUG nova.compute.manager [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing instance network info cache due to event network-changed-03b48f25-c36b-4787-921d-4208c750d11b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.753 187010 DEBUG oslo_concurrency.lockutils [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.753 187010 DEBUG oslo_concurrency.lockutils [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.753 187010 DEBUG nova.network.neutron [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Refreshing network info cache for port 03b48f25-c36b-4787-921d-4208c750d11b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.791 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.791 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.792 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.792 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.792 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.793 187010 INFO nova.compute.manager [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Terminating instance#033[00m
Dec 11 04:54:28 np0005555140 nova_compute[187006]: 2025-12-11 09:54:28.794 187010 DEBUG nova.compute.manager [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:54:29 np0005555140 kernel: tap03b48f25-c3 (unregistering): left promiscuous mode
Dec 11 04:54:29 np0005555140 NetworkManager[55531]: <info>  [1765446869.0318] device (tap03b48f25-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.034 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:29 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:29Z|00062|binding|INFO|Releasing lport 03b48f25-c36b-4787-921d-4208c750d11b from this chassis (sb_readonly=0)
Dec 11 04:54:29 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:29Z|00063|binding|INFO|Setting lport 03b48f25-c36b-4787-921d-4208c750d11b down in Southbound
Dec 11 04:54:29 np0005555140 ovn_controller[95438]: 2025-12-11T09:54:29Z|00064|binding|INFO|Removing iface tap03b48f25-c3 ovn-installed in OVS
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.038 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:29.042 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:0c:68 10.100.0.6'], port_security=['fa:16:3e:8b:0c:68 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c4077e4b-5e35-4f12-851b-3c078e730784', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be1a9aae-13dd-4cf1-babc-e8afc3c22735, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=03b48f25-c36b-4787-921d-4208c750d11b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:54:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:29.043 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 03b48f25-c36b-4787-921d-4208c750d11b in datapath dd4b248c-547e-4f62-84f1-a16f3c8c8d18 unbound from our chassis#033[00m
Dec 11 04:54:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:29.043 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd4b248c-547e-4f62-84f1-a16f3c8c8d18, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:54:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:29.044 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[d06d26c3-2ae0-4de5-8049-56256788f74c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:29.045 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18 namespace which is not needed anymore#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.053 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:29 np0005555140 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec 11 04:54:29 np0005555140 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 13.613s CPU time.
Dec 11 04:54:29 np0005555140 systemd-machined[153398]: Machine qemu-3-instance-00000003 terminated.
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.226 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.268 187010 INFO nova.virt.libvirt.driver [-] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Instance destroyed successfully.#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.269 187010 DEBUG nova.objects.instance [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.283 187010 DEBUG nova.virt.libvirt.vif [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:53:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1741934441',display_name='tempest-TestNetworkBasicOps-server-1741934441',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1741934441',id=3,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOIxBDJswjrrxXAC0bCQF79jWGvCHI5U4Lq4bFlZT/IHLsGbfoVPrgLlabluOS6sBoEu4KNyibS1ES/GfEfiYQxTCPdM2v6tfRPUajSg/E6ABTtmUgqG+2aYxoU+u+LAJg==',key_name='tempest-TestNetworkBasicOps-120074369',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:53:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-g21qazjo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:53:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.284 187010 DEBUG nova.network.os_vif_util [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.284 187010 DEBUG nova.network.os_vif_util [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:0c:68,bridge_name='br-int',has_traffic_filtering=True,id=03b48f25-c36b-4787-921d-4208c750d11b,network=Network(dd4b248c-547e-4f62-84f1-a16f3c8c8d18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b48f25-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.284 187010 DEBUG os_vif [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:0c:68,bridge_name='br-int',has_traffic_filtering=True,id=03b48f25-c36b-4787-921d-4208c750d11b,network=Network(dd4b248c-547e-4f62-84f1-a16f3c8c8d18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b48f25-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.285 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.286 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03b48f25-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.287 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.289 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.291 187010 INFO os_vif [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:0c:68,bridge_name='br-int',has_traffic_filtering=True,id=03b48f25-c36b-4787-921d-4208c750d11b,network=Network(dd4b248c-547e-4f62-84f1-a16f3c8c8d18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b48f25-c3')#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.291 187010 INFO nova.virt.libvirt.driver [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Deleting instance files /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1_del#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.292 187010 INFO nova.virt.libvirt.driver [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Deletion of /var/lib/nova/instances/3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1_del complete#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.336 187010 INFO nova.compute.manager [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Took 0.54 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.337 187010 DEBUG oslo.service.loopingcall [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.337 187010 DEBUG nova.compute.manager [-] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:54:29 np0005555140 nova_compute[187006]: 2025-12-11 09:54:29.337 187010 DEBUG nova.network.neutron [-] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:54:29 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [NOTICE]   (214476) : haproxy version is 2.8.14-c23fe91
Dec 11 04:54:29 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [NOTICE]   (214476) : path to executable is /usr/sbin/haproxy
Dec 11 04:54:29 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [WARNING]  (214476) : Exiting Master process...
Dec 11 04:54:29 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [WARNING]  (214476) : Exiting Master process...
Dec 11 04:54:29 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [ALERT]    (214476) : Current worker (214478) exited with code 143 (Terminated)
Dec 11 04:54:29 np0005555140 neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18[214472]: [WARNING]  (214476) : All workers exited. Exiting... (0)
Dec 11 04:54:29 np0005555140 systemd[1]: libpod-cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05.scope: Deactivated successfully.
Dec 11 04:54:29 np0005555140 podman[214880]: 2025-12-11 09:54:29.637150551 +0000 UTC m=+0.469706944 container died cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 04:54:30 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05-userdata-shm.mount: Deactivated successfully.
Dec 11 04:54:30 np0005555140 systemd[1]: var-lib-containers-storage-overlay-1b00141b03919a2dee7108b909db5c988fa9831adf0d62397dfe0fe48b4a5c07-merged.mount: Deactivated successfully.
Dec 11 04:54:30 np0005555140 podman[214880]: 2025-12-11 09:54:30.73311051 +0000 UTC m=+1.565666883 container cleanup cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:54:30 np0005555140 systemd[1]: libpod-conmon-cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05.scope: Deactivated successfully.
Dec 11 04:54:30 np0005555140 podman[214925]: 2025-12-11 09:54:30.917129982 +0000 UTC m=+0.159649049 container remove cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.922 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[91519c93-3c92-43b4-bc2b-5567aa17d140]: (4, ('Thu Dec 11 09:54:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18 (cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05)\ncb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05\nThu Dec 11 09:54:30 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18 (cb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05)\ncb8c01e7e41f5f8d45cb7ba0f5e70cc5eca973b3a2683d62b07c0c7654e62d05\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.926 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[2563efa2-c3db-4c29-8284-081806e84541]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.927 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd4b248c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.929 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:30 np0005555140 kernel: tapdd4b248c-50: left promiscuous mode
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.940 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.943 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[09f11ca3-ca6c-4e33-80bb-fff88f845a6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.948 187010 DEBUG nova.compute.manager [req-f156c3b7-0925-4473-9d4d-e98695101c90 req-f7e0d9ab-dd64-4e67-ac82-1d4bbf9047d7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-unplugged-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.948 187010 DEBUG oslo_concurrency.lockutils [req-f156c3b7-0925-4473-9d4d-e98695101c90 req-f7e0d9ab-dd64-4e67-ac82-1d4bbf9047d7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.948 187010 DEBUG oslo_concurrency.lockutils [req-f156c3b7-0925-4473-9d4d-e98695101c90 req-f7e0d9ab-dd64-4e67-ac82-1d4bbf9047d7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.949 187010 DEBUG oslo_concurrency.lockutils [req-f156c3b7-0925-4473-9d4d-e98695101c90 req-f7e0d9ab-dd64-4e67-ac82-1d4bbf9047d7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.949 187010 DEBUG nova.compute.manager [req-f156c3b7-0925-4473-9d4d-e98695101c90 req-f7e0d9ab-dd64-4e67-ac82-1d4bbf9047d7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] No waiting events found dispatching network-vif-unplugged-03b48f25-c36b-4787-921d-4208c750d11b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:54:30 np0005555140 nova_compute[187006]: 2025-12-11 09:54:30.949 187010 DEBUG nova.compute.manager [req-f156c3b7-0925-4473-9d4d-e98695101c90 req-f7e0d9ab-dd64-4e67-ac82-1d4bbf9047d7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-unplugged-03b48f25-c36b-4787-921d-4208c750d11b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.958 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[149976bd-3ee0-4395-9b74-caf8357e48f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.959 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[eca5f9ae-cbd9-412b-9e0b-2d02ecbe87f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:30 np0005555140 podman[214937]: 2025-12-11 09:54:30.962121697 +0000 UTC m=+0.182318037 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.974 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb2b1d2-51bc-45c9-8b30-5aa204e6d599]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 318262, 'reachable_time': 25945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214979, 'error': None, 'target': 'ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:30 np0005555140 systemd[1]: run-netns-ovnmeta\x2ddd4b248c\x2d547e\x2d4f62\x2d84f1\x2da16f3c8c8d18.mount: Deactivated successfully.
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.978 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd4b248c-547e-4f62-84f1-a16f3c8c8d18 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:54:30 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:30.978 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c532ec-66a5-4d1b-8b75-2a277af03141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:54:30 np0005555140 podman[214926]: 2025-12-11 09:54:30.991038938 +0000 UTC m=+0.209817088 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.079 187010 DEBUG nova.network.neutron [-] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.098 187010 INFO nova.compute.manager [-] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Took 1.76 seconds to deallocate network for instance.#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.144 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.144 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.161 187010 DEBUG nova.compute.manager [req-7ee68692-8aca-43f3-97a0-a4023d5c8a93 req-296bbb0b-f0b8-44b7-8b8a-33ef8464cdd7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-deleted-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.210 187010 DEBUG nova.compute.provider_tree [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.226 187010 DEBUG nova.scheduler.client.report [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.246 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.275 187010 INFO nova.scheduler.client.report [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.332 187010 DEBUG oslo_concurrency.lockutils [None req-52f04283-e8d2-4c81-be53-011e22705303 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.360 187010 DEBUG nova.network.neutron [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updated VIF entry in instance network info cache for port 03b48f25-c36b-4787-921d-4208c750d11b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.361 187010 DEBUG nova.network.neutron [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Updating instance_info_cache with network_info: [{"id": "03b48f25-c36b-4787-921d-4208c750d11b", "address": "fa:16:3e:8b:0c:68", "network": {"id": "dd4b248c-547e-4f62-84f1-a16f3c8c8d18", "bridge": "br-int", "label": "tempest-network-smoke--909919989", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b48f25-c3", "ovs_interfaceid": "03b48f25-c36b-4787-921d-4208c750d11b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.379 187010 DEBUG oslo_concurrency.lockutils [req-48a3e48a-7184-4173-8f3f-9a8c9f1d41af req-4e73418c-f0ca-417e-af06-662511864178 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:54:31 np0005555140 nova_compute[187006]: 2025-12-11 09:54:31.708 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:33 np0005555140 nova_compute[187006]: 2025-12-11 09:54:33.117 187010 DEBUG nova.compute.manager [req-f0ba7e24-3653-488d-a169-a4d9dcf61cc4 req-180fcf5d-0dd9-4470-9046-2ea2aa10a591 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received event network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:54:33 np0005555140 nova_compute[187006]: 2025-12-11 09:54:33.117 187010 DEBUG oslo_concurrency.lockutils [req-f0ba7e24-3653-488d-a169-a4d9dcf61cc4 req-180fcf5d-0dd9-4470-9046-2ea2aa10a591 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:33 np0005555140 nova_compute[187006]: 2025-12-11 09:54:33.118 187010 DEBUG oslo_concurrency.lockutils [req-f0ba7e24-3653-488d-a169-a4d9dcf61cc4 req-180fcf5d-0dd9-4470-9046-2ea2aa10a591 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:33 np0005555140 nova_compute[187006]: 2025-12-11 09:54:33.118 187010 DEBUG oslo_concurrency.lockutils [req-f0ba7e24-3653-488d-a169-a4d9dcf61cc4 req-180fcf5d-0dd9-4470-9046-2ea2aa10a591 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:33 np0005555140 nova_compute[187006]: 2025-12-11 09:54:33.118 187010 DEBUG nova.compute.manager [req-f0ba7e24-3653-488d-a169-a4d9dcf61cc4 req-180fcf5d-0dd9-4470-9046-2ea2aa10a591 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] No waiting events found dispatching network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:54:33 np0005555140 nova_compute[187006]: 2025-12-11 09:54:33.118 187010 WARNING nova.compute.manager [req-f0ba7e24-3653-488d-a169-a4d9dcf61cc4 req-180fcf5d-0dd9-4470-9046-2ea2aa10a591 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Received unexpected event network-vif-plugged-03b48f25-c36b-4787-921d-4208c750d11b for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:54:34 np0005555140 nova_compute[187006]: 2025-12-11 09:54:34.289 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:35 np0005555140 nova_compute[187006]: 2025-12-11 09:54:35.170 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:35 np0005555140 nova_compute[187006]: 2025-12-11 09:54:35.243 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:36 np0005555140 nova_compute[187006]: 2025-12-11 09:54:36.709 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:36 np0005555140 podman[214981]: 2025-12-11 09:54:36.715723206 +0000 UTC m=+0.085420585 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:54:39 np0005555140 nova_compute[187006]: 2025-12-11 09:54:39.291 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:40.080 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:54:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:40.081 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:54:40 np0005555140 nova_compute[187006]: 2025-12-11 09:54:40.080 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:40 np0005555140 podman[215000]: 2025-12-11 09:54:40.679763177 +0000 UTC m=+0.060019782 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:54:41 np0005555140 nova_compute[187006]: 2025-12-11 09:54:41.711 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:44 np0005555140 nova_compute[187006]: 2025-12-11 09:54:44.267 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765446869.2658985, 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:54:44 np0005555140 nova_compute[187006]: 2025-12-11 09:54:44.268 187010 INFO nova.compute.manager [-] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:54:44 np0005555140 nova_compute[187006]: 2025-12-11 09:54:44.287 187010 DEBUG nova.compute.manager [None req-2ada730f-881e-4fc4-939f-ff55141bbcfd - - - - - -] [instance: 3c40e4ed-fea9-4e8d-8063-f6f6c870bdf1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:54:44 np0005555140 nova_compute[187006]: 2025-12-11 09:54:44.293 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:44 np0005555140 podman[215026]: 2025-12-11 09:54:44.703510261 +0000 UTC m=+0.066598755 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 11 04:54:44 np0005555140 podman[215025]: 2025-12-11 09:54:44.741148673 +0000 UTC m=+0.108099204 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 11 04:54:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:46.083 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:54:46 np0005555140 nova_compute[187006]: 2025-12-11 09:54:46.713 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:48.619 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:48.620 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:54:48.620 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:49 np0005555140 nova_compute[187006]: 2025-12-11 09:54:49.295 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:49 np0005555140 nova_compute[187006]: 2025-12-11 09:54:49.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:50 np0005555140 nova_compute[187006]: 2025-12-11 09:54:50.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:50 np0005555140 nova_compute[187006]: 2025-12-11 09:54:50.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:54:50 np0005555140 nova_compute[187006]: 2025-12-11 09:54:50.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:54:51 np0005555140 nova_compute[187006]: 2025-12-11 09:54:51.025 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:54:51 np0005555140 nova_compute[187006]: 2025-12-11 09:54:51.714 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:52 np0005555140 nova_compute[187006]: 2025-12-11 09:54:52.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:53 np0005555140 podman[215070]: 2025-12-11 09:54:53.685618629 +0000 UTC m=+0.060311311 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:54:53 np0005555140 nova_compute[187006]: 2025-12-11 09:54:53.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:53 np0005555140 nova_compute[187006]: 2025-12-11 09:54:53.920 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:53 np0005555140 nova_compute[187006]: 2025-12-11 09:54:53.921 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.026 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.026 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.026 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.027 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.210 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.212 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5771MB free_disk=73.32852554321289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.212 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.212 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:54 np0005555140 nova_compute[187006]: 2025-12-11 09:54:54.298 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:56 np0005555140 nova_compute[187006]: 2025-12-11 09:54:56.716 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.008 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 73beb74f-4086-464a-a2ac-addc789a6c70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.008 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.008 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.045 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.060 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.067 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.068 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.085 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.086 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.873s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.087 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.151 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.151 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.157 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.157 187010 INFO nova.compute.claims [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.264 187010 DEBUG nova.compute.provider_tree [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.281 187010 DEBUG nova.scheduler.client.report [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.304 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.304 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.353 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.353 187010 DEBUG nova.network.neutron [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.371 187010 INFO nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.388 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.471 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.473 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.473 187010 INFO nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Creating image(s)#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.474 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.474 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.475 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.488 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.561 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.562 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.563 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.576 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.637 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.639 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.675 187010 DEBUG nova.policy [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.679 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.680 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.680 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.770 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.771 187010 DEBUG nova.virt.disk.api [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.772 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.833 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.834 187010 DEBUG nova.virt.disk.api [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.835 187010 DEBUG nova.objects.instance [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 73beb74f-4086-464a-a2ac-addc789a6c70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.855 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.855 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Ensure instance console log exists: /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.856 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.856 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.857 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.993 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.994 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.994 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.995 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:54:57 np0005555140 nova_compute[187006]: 2025-12-11 09:54:57.995 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:54:58 np0005555140 nova_compute[187006]: 2025-12-11 09:54:58.344 187010 DEBUG nova.network.neutron [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Successfully created port: 466d3171-bc29-4505-a82e-b17655482b51 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:54:59 np0005555140 nova_compute[187006]: 2025-12-11 09:54:59.299 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:54:59 np0005555140 nova_compute[187006]: 2025-12-11 09:54:59.746 187010 DEBUG nova.network.neutron [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Successfully updated port: 466d3171-bc29-4505-a82e-b17655482b51 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:55:00 np0005555140 nova_compute[187006]: 2025-12-11 09:55:00.022 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:55:00 np0005555140 nova_compute[187006]: 2025-12-11 09:55:00.022 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:55:00 np0005555140 nova_compute[187006]: 2025-12-11 09:55:00.023 187010 DEBUG nova.network.neutron [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:55:00 np0005555140 nova_compute[187006]: 2025-12-11 09:55:00.032 187010 DEBUG nova.compute.manager [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-changed-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:55:00 np0005555140 nova_compute[187006]: 2025-12-11 09:55:00.033 187010 DEBUG nova.compute.manager [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Refreshing instance network info cache due to event network-changed-466d3171-bc29-4505-a82e-b17655482b51. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:55:00 np0005555140 nova_compute[187006]: 2025-12-11 09:55:00.033 187010 DEBUG oslo_concurrency.lockutils [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:55:00 np0005555140 nova_compute[187006]: 2025-12-11 09:55:00.651 187010 DEBUG nova.network.neutron [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.664 187010 DEBUG nova.network.neutron [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updating instance_info_cache with network_info: [{"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:55:01 np0005555140 podman[215110]: 2025-12-11 09:55:01.687556415 +0000 UTC m=+0.062054145 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.688 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.688 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Instance network_info: |[{"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.688 187010 DEBUG oslo_concurrency.lockutils [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.689 187010 DEBUG nova.network.neutron [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Refreshing network info cache for port 466d3171-bc29-4505-a82e-b17655482b51 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:55:01 np0005555140 podman[215111]: 2025-12-11 09:55:01.692375853 +0000 UTC m=+0.060073759 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.692 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Start _get_guest_xml network_info=[{"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.697 187010 WARNING nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.703 187010 DEBUG nova.virt.libvirt.host [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.704 187010 DEBUG nova.virt.libvirt.host [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.707 187010 DEBUG nova.virt.libvirt.host [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.708 187010 DEBUG nova.virt.libvirt.host [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.708 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.708 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.709 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.709 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.709 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.709 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.709 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.710 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.710 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.710 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.710 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.710 187010 DEBUG nova.virt.hardware [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.714 187010 DEBUG nova.virt.libvirt.vif [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1259410543',display_name='tempest-TestNetworkBasicOps-server-1259410543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1259410543',id=4,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONeMUed8EI2O0baeaJqPw83CgKBoj8BW4j9gYw/NtVLiRZG7S5SH8/gYx4O5Tj31WYQUncS3fWovBKgoJyY/mxtL19J3YMn/K4n7p4Y5xezpDxqNzH4AbNzYAMNJVh8+A==',key_name='tempest-TestNetworkBasicOps-2100690359',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-vvooq7ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:54:57Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=73beb74f-4086-464a-a2ac-addc789a6c70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.715 187010 DEBUG nova.network.os_vif_util [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.715 187010 DEBUG nova.network.os_vif_util [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:38:b9,bridge_name='br-int',has_traffic_filtering=True,id=466d3171-bc29-4505-a82e-b17655482b51,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466d3171-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.716 187010 DEBUG nova.objects.instance [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 73beb74f-4086-464a-a2ac-addc789a6c70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.718 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.733 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <uuid>73beb74f-4086-464a-a2ac-addc789a6c70</uuid>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <name>instance-00000004</name>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1259410543</nova:name>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:55:01</nova:creationTime>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        <nova:port uuid="466d3171-bc29-4505-a82e-b17655482b51">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <entry name="serial">73beb74f-4086-464a-a2ac-addc789a6c70</entry>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <entry name="uuid">73beb74f-4086-464a-a2ac-addc789a6c70</entry>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk.config"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:03:38:b9"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <target dev="tap466d3171-bc"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/console.log" append="off"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:55:01 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:55:01 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:55:01 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:55:01 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.733 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Preparing to wait for external event network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.734 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.734 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.734 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.735 187010 DEBUG nova.virt.libvirt.vif [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1259410543',display_name='tempest-TestNetworkBasicOps-server-1259410543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1259410543',id=4,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONeMUed8EI2O0baeaJqPw83CgKBoj8BW4j9gYw/NtVLiRZG7S5SH8/gYx4O5Tj31WYQUncS3fWovBKgoJyY/mxtL19J3YMn/K4n7p4Y5xezpDxqNzH4AbNzYAMNJVh8+A==',key_name='tempest-TestNetworkBasicOps-2100690359',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-vvooq7ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:54:57Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=73beb74f-4086-464a-a2ac-addc789a6c70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.735 187010 DEBUG nova.network.os_vif_util [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.736 187010 DEBUG nova.network.os_vif_util [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:38:b9,bridge_name='br-int',has_traffic_filtering=True,id=466d3171-bc29-4505-a82e-b17655482b51,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466d3171-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.736 187010 DEBUG os_vif [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:38:b9,bridge_name='br-int',has_traffic_filtering=True,id=466d3171-bc29-4505-a82e-b17655482b51,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466d3171-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.737 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.737 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.738 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.742 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.742 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap466d3171-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.743 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap466d3171-bc, col_values=(('external_ids', {'iface-id': '466d3171-bc29-4505-a82e-b17655482b51', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:38:b9', 'vm-uuid': '73beb74f-4086-464a-a2ac-addc789a6c70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.744 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:01 np0005555140 NetworkManager[55531]: <info>  [1765446901.7456] manager: (tap466d3171-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.746 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.753 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.755 187010 INFO os_vif [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:38:b9,bridge_name='br-int',has_traffic_filtering=True,id=466d3171-bc29-4505-a82e-b17655482b51,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466d3171-bc')#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.890 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.890 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.890 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:03:38:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:55:01 np0005555140 nova_compute[187006]: 2025-12-11 09:55:01.891 187010 INFO nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Using config drive#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.198 187010 INFO nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Creating config drive at /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk.config#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.202 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_h_r6e61 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.327 187010 DEBUG oslo_concurrency.processutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_h_r6e61" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:02 np0005555140 NetworkManager[55531]: <info>  [1765446902.4053] manager: (tap466d3171-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Dec 11 04:55:02 np0005555140 kernel: tap466d3171-bc: entered promiscuous mode
Dec 11 04:55:02 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:02Z|00065|binding|INFO|Claiming lport 466d3171-bc29-4505-a82e-b17655482b51 for this chassis.
Dec 11 04:55:02 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:02Z|00066|binding|INFO|466d3171-bc29-4505-a82e-b17655482b51: Claiming fa:16:3e:03:38:b9 10.100.0.4
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.409 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.412 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.427 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:38:b9 10.100.0.4'], port_security=['fa:16:3e:03:38:b9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7dff39d4-6c24-429a-8738-9686bb4cece9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=579228a5-41f0-484d-8fb8-96b0a910160e, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=466d3171-bc29-4505-a82e-b17655482b51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.430 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 466d3171-bc29-4505-a82e-b17655482b51 in datapath 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf bound to our chassis#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.432 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.452 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ee36e397-5fcb-4fe0-90fc-23f9fbdec5e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.454 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap316d1cbf-31 in ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.463 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap316d1cbf-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.464 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[978332ba-6426-4015-be3b-bb23f1b6a8a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 systemd-udevd[215165]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.467 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[04674569-3a70-4a64-83e4-1ad4606a7a4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.470 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:02 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:02Z|00067|binding|INFO|Setting lport 466d3171-bc29-4505-a82e-b17655482b51 ovn-installed in OVS
Dec 11 04:55:02 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:02Z|00068|binding|INFO|Setting lport 466d3171-bc29-4505-a82e-b17655482b51 up in Southbound
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.480 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.481 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[81d15c43-0643-4d28-ab5d-d724bdc2247d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 NetworkManager[55531]: <info>  [1765446902.4842] device (tap466d3171-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:55:02 np0005555140 NetworkManager[55531]: <info>  [1765446902.4850] device (tap466d3171-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.496 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f7cd7fb6-1b6a-458e-9955-cdf0f070b287]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 systemd-machined[153398]: New machine qemu-4-instance-00000004.
Dec 11 04:55:02 np0005555140 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.529 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[1919100a-f6a0-490b-9eb8-206886ea5131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 systemd-udevd[215171]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.535 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[40af77f9-5444-471b-becb-1937d0bcf046]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 NetworkManager[55531]: <info>  [1765446902.5371] manager: (tap316d1cbf-30): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.574 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[da598ba6-c213-4e98-9306-56141a8dd457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.579 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[493a727e-405b-40b8-8a1c-d3edc2f1db82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 NetworkManager[55531]: <info>  [1765446902.6050] device (tap316d1cbf-30): carrier: link connected
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.610 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[82975315-a955-4d20-90f3-e6fceffd7a91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.626 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[1dea3d93-6064-4c0c-aa58-4ed1e80392c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316d1cbf-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:9c:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326117, 'reachable_time': 43680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215201, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.640 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[87bc9a42-44d6-412e-bbfe-c2ef6efc0e6b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef8:9c44'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326117, 'tstamp': 326117}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215202, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.658 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1abc46-1cc1-4bd9-a884-cc48dbf10858]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316d1cbf-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:9c:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326117, 'reachable_time': 43680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215203, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.690 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6173a7ed-e145-461b-a93b-fa483d57939c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.758 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e2b4c995-8857-4ba2-9945-85c95f29cdd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.760 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316d1cbf-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.760 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.761 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap316d1cbf-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.763 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:02 np0005555140 NetworkManager[55531]: <info>  [1765446902.7642] manager: (tap316d1cbf-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Dec 11 04:55:02 np0005555140 kernel: tap316d1cbf-30: entered promiscuous mode
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.766 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap316d1cbf-30, col_values=(('external_ids', {'iface-id': 'c8d19757-078a-4c1b-90f0-a2eb4ce2f692'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:02 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:02Z|00069|binding|INFO|Releasing lport c8d19757-078a-4c1b-90f0-a2eb4ce2f692 from this chassis (sb_readonly=0)
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.768 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.779 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.781 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/316d1cbf-3d8b-4adf-8c17-c7fde41c1daf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/316d1cbf-3d8b-4adf-8c17-c7fde41c1daf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.782 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0ecee4-cd0b-4189-ad30-dc65cdc062d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.783 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/316d1cbf-3d8b-4adf-8c17-c7fde41c1daf.pid.haproxy
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:55:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:02.784 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'env', 'PROCESS_TAG=haproxy-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/316d1cbf-3d8b-4adf-8c17-c7fde41c1daf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.828 187010 DEBUG nova.compute.manager [req-596974ca-b576-4f5c-85b9-7babf84aadf1 req-cbd54164-5553-44c8-9e55-d3b9120a8e15 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.828 187010 DEBUG oslo_concurrency.lockutils [req-596974ca-b576-4f5c-85b9-7babf84aadf1 req-cbd54164-5553-44c8-9e55-d3b9120a8e15 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.829 187010 DEBUG oslo_concurrency.lockutils [req-596974ca-b576-4f5c-85b9-7babf84aadf1 req-cbd54164-5553-44c8-9e55-d3b9120a8e15 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.829 187010 DEBUG oslo_concurrency.lockutils [req-596974ca-b576-4f5c-85b9-7babf84aadf1 req-cbd54164-5553-44c8-9e55-d3b9120a8e15 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:02 np0005555140 nova_compute[187006]: 2025-12-11 09:55:02.829 187010 DEBUG nova.compute.manager [req-596974ca-b576-4f5c-85b9-7babf84aadf1 req-cbd54164-5553-44c8-9e55-d3b9120a8e15 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Processing event network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.108 187010 DEBUG nova.network.neutron [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updated VIF entry in instance network info cache for port 466d3171-bc29-4505-a82e-b17655482b51. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.109 187010 DEBUG nova.network.neutron [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updating instance_info_cache with network_info: [{"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.129 187010 DEBUG oslo_concurrency.lockutils [req-b811d78a-dd05-4328-9a81-fa75b533c7b4 req-40e06fec-b1b5-4579-aca8-4d91b7ffbfe9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:55:03 np0005555140 podman[215234]: 2025-12-11 09:55:03.142417031 +0000 UTC m=+0.018901481 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.274 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446903.2745235, 73beb74f-4086-464a-a2ac-addc789a6c70 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.275 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] VM Started (Lifecycle Event)#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.277 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.280 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.283 187010 INFO nova.virt.libvirt.driver [-] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Instance spawned successfully.#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.283 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:55:03 np0005555140 podman[215234]: 2025-12-11 09:55:03.486750229 +0000 UTC m=+0.363234679 container create 884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 04:55:03 np0005555140 systemd[1]: Started libpod-conmon-884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2.scope.
Dec 11 04:55:03 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:55:03 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/647f1ed7a4743482092546565bb541ff52af18c0f91fc001b7101532b6c68d1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:55:03 np0005555140 podman[215234]: 2025-12-11 09:55:03.600519103 +0000 UTC m=+0.477003533 container init 884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 11 04:55:03 np0005555140 podman[215234]: 2025-12-11 09:55:03.6091858 +0000 UTC m=+0.485670230 container start 884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 04:55:03 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [NOTICE]   (215260) : New worker (215262) forked
Dec 11 04:55:03 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [NOTICE]   (215260) : Loading success.
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.735 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:55:03 np0005555140 nova_compute[187006]: 2025-12-11 09:55:03.740 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.330 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.331 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.331 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.332 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.332 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.332 187010 DEBUG nova.virt.libvirt.driver [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.335 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.336 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446903.2775037, 73beb74f-4086-464a-a2ac-addc789a6c70 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.336 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.364 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.367 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446903.279712, 73beb74f-4086-464a-a2ac-addc789a6c70 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.367 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.388 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.391 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.395 187010 INFO nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Took 6.92 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.395 187010 DEBUG nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.418 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.449 187010 INFO nova.compute.manager [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Took 7.31 seconds to build instance.#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.724 187010 DEBUG oslo_concurrency.lockutils [None req-9d637506-0eb5-41bc-ad2a-2da097d76a30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.930 187010 DEBUG nova.compute.manager [req-f95b978d-4656-46fc-bf19-e7e508dab540 req-d91a24f8-5e63-4026-9389-46ff5bff89e6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.931 187010 DEBUG oslo_concurrency.lockutils [req-f95b978d-4656-46fc-bf19-e7e508dab540 req-d91a24f8-5e63-4026-9389-46ff5bff89e6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.931 187010 DEBUG oslo_concurrency.lockutils [req-f95b978d-4656-46fc-bf19-e7e508dab540 req-d91a24f8-5e63-4026-9389-46ff5bff89e6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.931 187010 DEBUG oslo_concurrency.lockutils [req-f95b978d-4656-46fc-bf19-e7e508dab540 req-d91a24f8-5e63-4026-9389-46ff5bff89e6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.931 187010 DEBUG nova.compute.manager [req-f95b978d-4656-46fc-bf19-e7e508dab540 req-d91a24f8-5e63-4026-9389-46ff5bff89e6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] No waiting events found dispatching network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:55:04 np0005555140 nova_compute[187006]: 2025-12-11 09:55:04.932 187010 WARNING nova.compute.manager [req-f95b978d-4656-46fc-bf19-e7e508dab540 req-d91a24f8-5e63-4026-9389-46ff5bff89e6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received unexpected event network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:55:06 np0005555140 nova_compute[187006]: 2025-12-11 09:55:06.720 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:06 np0005555140 nova_compute[187006]: 2025-12-11 09:55:06.745 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:07 np0005555140 podman[215272]: 2025-12-11 09:55:07.725979374 +0000 UTC m=+0.094369880 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:55:09 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:09Z|00070|binding|INFO|Releasing lport c8d19757-078a-4c1b-90f0-a2eb4ce2f692 from this chassis (sb_readonly=0)
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.123 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:09 np0005555140 NetworkManager[55531]: <info>  [1765446909.1247] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Dec 11 04:55:09 np0005555140 NetworkManager[55531]: <info>  [1765446909.1252] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec 11 04:55:09 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:09Z|00071|binding|INFO|Releasing lport c8d19757-078a-4c1b-90f0-a2eb4ce2f692 from this chassis (sb_readonly=0)
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.158 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.163 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.404 187010 DEBUG nova.compute.manager [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-changed-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.405 187010 DEBUG nova.compute.manager [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Refreshing instance network info cache due to event network-changed-466d3171-bc29-4505-a82e-b17655482b51. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.405 187010 DEBUG oslo_concurrency.lockutils [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.405 187010 DEBUG oslo_concurrency.lockutils [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:55:09 np0005555140 nova_compute[187006]: 2025-12-11 09:55:09.406 187010 DEBUG nova.network.neutron [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Refreshing network info cache for port 466d3171-bc29-4505-a82e-b17655482b51 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:55:11 np0005555140 podman[215293]: 2025-12-11 09:55:11.670303625 +0000 UTC m=+0.049470326 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:55:11 np0005555140 nova_compute[187006]: 2025-12-11 09:55:11.723 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:11 np0005555140 nova_compute[187006]: 2025-12-11 09:55:11.746 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:13 np0005555140 nova_compute[187006]: 2025-12-11 09:55:13.852 187010 DEBUG nova.network.neutron [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updated VIF entry in instance network info cache for port 466d3171-bc29-4505-a82e-b17655482b51. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:55:13 np0005555140 nova_compute[187006]: 2025-12-11 09:55:13.853 187010 DEBUG nova.network.neutron [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updating instance_info_cache with network_info: [{"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:55:13 np0005555140 nova_compute[187006]: 2025-12-11 09:55:13.882 187010 DEBUG oslo_concurrency.lockutils [req-918865dc-6999-469e-88ee-59d20bd50376 req-ae3fedef-b275-4a5e-a9f0-6c9084ca95fa b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:55:15 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:15Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:38:b9 10.100.0.4
Dec 11 04:55:15 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:15Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:38:b9 10.100.0.4
Dec 11 04:55:15 np0005555140 podman[215333]: 2025-12-11 09:55:15.696986941 +0000 UTC m=+0.058651948 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 11 04:55:15 np0005555140 podman[215332]: 2025-12-11 09:55:15.711465665 +0000 UTC m=+0.084045834 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 11 04:55:16 np0005555140 nova_compute[187006]: 2025-12-11 09:55:16.724 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:16 np0005555140 nova_compute[187006]: 2025-12-11 09:55:16.748 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:21 np0005555140 nova_compute[187006]: 2025-12-11 09:55:21.727 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:21 np0005555140 nova_compute[187006]: 2025-12-11 09:55:21.745 187010 INFO nova.compute.manager [None req-31bacb35-7927-4439-a472-3d66f7cd6903 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Get console output#033[00m
Dec 11 04:55:21 np0005555140 nova_compute[187006]: 2025-12-11 09:55:21.750 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:21 np0005555140 nova_compute[187006]: 2025-12-11 09:55:21.753 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 04:55:24 np0005555140 podman[215380]: 2025-12-11 09:55:24.676069026 +0000 UTC m=+0.050835084 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:55:24 np0005555140 nova_compute[187006]: 2025-12-11 09:55:24.946 187010 DEBUG nova.compute.manager [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-changed-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:55:24 np0005555140 nova_compute[187006]: 2025-12-11 09:55:24.947 187010 DEBUG nova.compute.manager [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Refreshing instance network info cache due to event network-changed-466d3171-bc29-4505-a82e-b17655482b51. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:55:24 np0005555140 nova_compute[187006]: 2025-12-11 09:55:24.947 187010 DEBUG oslo_concurrency.lockutils [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:55:24 np0005555140 nova_compute[187006]: 2025-12-11 09:55:24.947 187010 DEBUG oslo_concurrency.lockutils [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:55:24 np0005555140 nova_compute[187006]: 2025-12-11 09:55:24.947 187010 DEBUG nova.network.neutron [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Refreshing network info cache for port 466d3171-bc29-4505-a82e-b17655482b51 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:55:26 np0005555140 nova_compute[187006]: 2025-12-11 09:55:26.729 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:26 np0005555140 nova_compute[187006]: 2025-12-11 09:55:26.751 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:27 np0005555140 nova_compute[187006]: 2025-12-11 09:55:27.872 187010 DEBUG nova.network.neutron [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updated VIF entry in instance network info cache for port 466d3171-bc29-4505-a82e-b17655482b51. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:55:27 np0005555140 nova_compute[187006]: 2025-12-11 09:55:27.872 187010 DEBUG nova.network.neutron [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updating instance_info_cache with network_info: [{"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:55:27 np0005555140 nova_compute[187006]: 2025-12-11 09:55:27.908 187010 DEBUG oslo_concurrency.lockutils [req-ed57d426-f139-4283-afd2-194c8ec4f117 req-10a5c843-7cc1-41f8-a958-25db08ba81ae b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:55:31 np0005555140 nova_compute[187006]: 2025-12-11 09:55:31.732 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:31 np0005555140 nova_compute[187006]: 2025-12-11 09:55:31.753 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:32 np0005555140 podman[215405]: 2025-12-11 09:55:32.69412371 +0000 UTC m=+0.065374561 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 04:55:32 np0005555140 podman[215404]: 2025-12-11 09:55:32.726761893 +0000 UTC m=+0.096355687 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.157 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.157 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.182 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.282 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.283 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.292 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.293 187010 INFO nova.compute.claims [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.406 187010 DEBUG nova.compute.provider_tree [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.422 187010 DEBUG nova.scheduler.client.report [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.442 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.443 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.488 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.488 187010 DEBUG nova.network.neutron [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.508 187010 INFO nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.536 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.646 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.648 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.649 187010 INFO nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Creating image(s)#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.650 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.650 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.652 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.674 187010 DEBUG nova.policy [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.677 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.765 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.766 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.768 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.790 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.849 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:33 np0005555140 nova_compute[187006]: 2025-12-11 09:55:33.851 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.458 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk 1073741824" returned: 0 in 0.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.460 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.461 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.547 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.548 187010 DEBUG nova.virt.disk.api [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.549 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.618 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.619 187010 DEBUG nova.virt.disk.api [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.620 187010 DEBUG nova.objects.instance [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 08f2dd2f-0591-40fb-a7d3-db68e22f407e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.635 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.636 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Ensure instance console log exists: /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.637 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.637 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:34 np0005555140 nova_compute[187006]: 2025-12-11 09:55:34.638 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:36 np0005555140 nova_compute[187006]: 2025-12-11 09:55:36.733 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:36 np0005555140 nova_compute[187006]: 2025-12-11 09:55:36.755 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:36 np0005555140 nova_compute[187006]: 2025-12-11 09:55:36.775 187010 DEBUG nova.network.neutron [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Successfully created port: dc2871d6-1959-414d-9a58-f15f28494032 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.560 187010 DEBUG nova.network.neutron [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Successfully updated port: dc2871d6-1959-414d-9a58-f15f28494032 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.580 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.581 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.581 187010 DEBUG nova.network.neutron [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.660 187010 DEBUG nova.compute.manager [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-changed-dc2871d6-1959-414d-9a58-f15f28494032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.661 187010 DEBUG nova.compute.manager [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Refreshing instance network info cache due to event network-changed-dc2871d6-1959-414d-9a58-f15f28494032. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.661 187010 DEBUG oslo_concurrency.lockutils [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:55:38 np0005555140 podman[215459]: 2025-12-11 09:55:38.712960988 +0000 UTC m=+0.073760190 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 04:55:38 np0005555140 nova_compute[187006]: 2025-12-11 09:55:38.754 187010 DEBUG nova.network.neutron [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:55:39 np0005555140 nova_compute[187006]: 2025-12-11 09:55:39.800 187010 DEBUG nova.network.neutron [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Updating instance_info_cache with network_info: [{"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.615 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.616 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Instance network_info: |[{"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.616 187010 DEBUG oslo_concurrency.lockutils [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.617 187010 DEBUG nova.network.neutron [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Refreshing network info cache for port dc2871d6-1959-414d-9a58-f15f28494032 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.620 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Start _get_guest_xml network_info=[{"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.625 187010 WARNING nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.633 187010 DEBUG nova.virt.libvirt.host [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.634 187010 DEBUG nova.virt.libvirt.host [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.640 187010 DEBUG nova.virt.libvirt.host [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.641 187010 DEBUG nova.virt.libvirt.host [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.641 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.641 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.642 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.642 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.642 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.642 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.643 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.643 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.643 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.643 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.644 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.644 187010 DEBUG nova.virt.hardware [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.647 187010 DEBUG nova.virt.libvirt.vif [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:55:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1547843101',display_name='tempest-TestNetworkBasicOps-server-1547843101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1547843101',id=5,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI/HAB35jDnYlAgXSwOdSgEZ8lFtCDNF0w7iX0GHWRXYBFuH9pMMUqDaom/TGoIWxw7txoQ4Oe5AbrZCe/jRLmacRyAjeWwFfXB8zEjFKBDCmRRkmtI5W4ThQSGHYXz/XQ==',key_name='tempest-TestNetworkBasicOps-1790804775',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-rup0ei7a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:55:33Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=08f2dd2f-0591-40fb-a7d3-db68e22f407e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.647 187010 DEBUG nova.network.os_vif_util [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.648 187010 DEBUG nova.network.os_vif_util [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:c1:2a,bridge_name='br-int',has_traffic_filtering=True,id=dc2871d6-1959-414d-9a58-f15f28494032,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2871d6-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.648 187010 DEBUG nova.objects.instance [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 08f2dd2f-0591-40fb-a7d3-db68e22f407e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.943 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <uuid>08f2dd2f-0591-40fb-a7d3-db68e22f407e</uuid>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <name>instance-00000005</name>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1547843101</nova:name>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:55:40</nova:creationTime>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        <nova:port uuid="dc2871d6-1959-414d-9a58-f15f28494032">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <entry name="serial">08f2dd2f-0591-40fb-a7d3-db68e22f407e</entry>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <entry name="uuid">08f2dd2f-0591-40fb-a7d3-db68e22f407e</entry>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.config"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:82:c1:2a"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <target dev="tapdc2871d6-19"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/console.log" append="off"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:55:40 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:55:40 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:55:40 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:55:40 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.945 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Preparing to wait for external event network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.945 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.945 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.945 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.946 187010 DEBUG nova.virt.libvirt.vif [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:55:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1547843101',display_name='tempest-TestNetworkBasicOps-server-1547843101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1547843101',id=5,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI/HAB35jDnYlAgXSwOdSgEZ8lFtCDNF0w7iX0GHWRXYBFuH9pMMUqDaom/TGoIWxw7txoQ4Oe5AbrZCe/jRLmacRyAjeWwFfXB8zEjFKBDCmRRkmtI5W4ThQSGHYXz/XQ==',key_name='tempest-TestNetworkBasicOps-1790804775',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-rup0ei7a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:55:33Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=08f2dd2f-0591-40fb-a7d3-db68e22f407e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.946 187010 DEBUG nova.network.os_vif_util [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.946 187010 DEBUG nova.network.os_vif_util [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:c1:2a,bridge_name='br-int',has_traffic_filtering=True,id=dc2871d6-1959-414d-9a58-f15f28494032,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2871d6-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.947 187010 DEBUG os_vif [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:c1:2a,bridge_name='br-int',has_traffic_filtering=True,id=dc2871d6-1959-414d-9a58-f15f28494032,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2871d6-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.947 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.947 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.948 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.950 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.950 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc2871d6-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.950 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdc2871d6-19, col_values=(('external_ids', {'iface-id': 'dc2871d6-1959-414d-9a58-f15f28494032', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:c1:2a', 'vm-uuid': '08f2dd2f-0591-40fb-a7d3-db68e22f407e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.952 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:40 np0005555140 NetworkManager[55531]: <info>  [1765446940.9539] manager: (tapdc2871d6-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.955 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.961 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:40 np0005555140 nova_compute[187006]: 2025-12-11 09:55:40.962 187010 INFO os_vif [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:c1:2a,bridge_name='br-int',has_traffic_filtering=True,id=dc2871d6-1959-414d-9a58-f15f28494032,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2871d6-19')#033[00m
Dec 11 04:55:41 np0005555140 nova_compute[187006]: 2025-12-11 09:55:41.735 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:41 np0005555140 nova_compute[187006]: 2025-12-11 09:55:41.864 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:55:41 np0005555140 nova_compute[187006]: 2025-12-11 09:55:41.865 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:55:41 np0005555140 nova_compute[187006]: 2025-12-11 09:55:41.865 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:82:c1:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:55:41 np0005555140 nova_compute[187006]: 2025-12-11 09:55:41.866 187010 INFO nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Using config drive#033[00m
Dec 11 04:55:42 np0005555140 podman[215483]: 2025-12-11 09:55:42.699444454 +0000 UTC m=+0.063453396 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 04:55:44 np0005555140 nova_compute[187006]: 2025-12-11 09:55:44.673 187010 INFO nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Creating config drive at /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.config#033[00m
Dec 11 04:55:44 np0005555140 nova_compute[187006]: 2025-12-11 09:55:44.678 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8riji9ul execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:44 np0005555140 nova_compute[187006]: 2025-12-11 09:55:44.808 187010 DEBUG oslo_concurrency.processutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8riji9ul" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:44 np0005555140 kernel: tapdc2871d6-19: entered promiscuous mode
Dec 11 04:55:44 np0005555140 NetworkManager[55531]: <info>  [1765446944.8842] manager: (tapdc2871d6-19): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Dec 11 04:55:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:44Z|00072|binding|INFO|Claiming lport dc2871d6-1959-414d-9a58-f15f28494032 for this chassis.
Dec 11 04:55:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:44Z|00073|binding|INFO|dc2871d6-1959-414d-9a58-f15f28494032: Claiming fa:16:3e:82:c1:2a 10.100.0.5
Dec 11 04:55:44 np0005555140 nova_compute[187006]: 2025-12-11 09:55:44.886 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.896 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:c1:2a 10.100.0.5'], port_security=['fa:16:3e:82:c1:2a 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a7c212f1-a243-4ff7-936f-52fadb00dcc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=579228a5-41f0-484d-8fb8-96b0a910160e, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=dc2871d6-1959-414d-9a58-f15f28494032) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.897 104288 INFO neutron.agent.ovn.metadata.agent [-] Port dc2871d6-1959-414d-9a58-f15f28494032 in datapath 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf bound to our chassis#033[00m
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.898 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf#033[00m
Dec 11 04:55:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:44Z|00074|binding|INFO|Setting lport dc2871d6-1959-414d-9a58-f15f28494032 ovn-installed in OVS
Dec 11 04:55:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:44Z|00075|binding|INFO|Setting lport dc2871d6-1959-414d-9a58-f15f28494032 up in Southbound
Dec 11 04:55:44 np0005555140 nova_compute[187006]: 2025-12-11 09:55:44.900 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:44 np0005555140 nova_compute[187006]: 2025-12-11 09:55:44.903 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.917 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[43f335fb-a257-490d-bf4e-5dfc73eb779f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:44 np0005555140 systemd-udevd[215527]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:55:44 np0005555140 systemd-machined[153398]: New machine qemu-5-instance-00000005.
Dec 11 04:55:44 np0005555140 NetworkManager[55531]: <info>  [1765446944.9336] device (tapdc2871d6-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:55:44 np0005555140 NetworkManager[55531]: <info>  [1765446944.9341] device (tapdc2871d6-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.943 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[4fcafec6-6a55-464d-9dde-3bb1009ef02d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:44 np0005555140 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.948 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6cc1d1-e2d6-496c-8c6b-1c531e1d2121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.976 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[d6bccff6-6c13-4efa-926f-4dee409925b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:44 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:44.992 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[23478284-ce74-45c9-a5ec-e00caaa721c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316d1cbf-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:9c:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326117, 'reachable_time': 43680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215540, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:45 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:45.008 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[d49dfdf3-44dd-4cfa-9f25-80e3a33884d8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap316d1cbf-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326128, 'tstamp': 326128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215542, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap316d1cbf-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326132, 'tstamp': 326132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215542, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:55:45 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:45.010 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316d1cbf-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.011 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:45 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:45.012 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap316d1cbf-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:45 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:45.012 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:55:45 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:45.013 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap316d1cbf-30, col_values=(('external_ids', {'iface-id': 'c8d19757-078a-4c1b-90f0-a2eb4ce2f692'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 04:55:45 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:45.013 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.289 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446945.2887187, 08f2dd2f-0591-40fb-a7d3-db68e22f407e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.289 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] VM Started (Lifecycle Event)
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.311 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.315 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446945.2894602, 08f2dd2f-0591-40fb-a7d3-db68e22f407e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.315 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] VM Paused (Lifecycle Event)
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.332 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.335 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.353 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.888 187010 DEBUG nova.compute.manager [req-9dca94e4-77f9-4fe2-9612-f534cb1b3e87 req-c7cc2b9f-99ce-4953-82f4-91ca49fd2da7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.889 187010 DEBUG oslo_concurrency.lockutils [req-9dca94e4-77f9-4fe2-9612-f534cb1b3e87 req-c7cc2b9f-99ce-4953-82f4-91ca49fd2da7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.889 187010 DEBUG oslo_concurrency.lockutils [req-9dca94e4-77f9-4fe2-9612-f534cb1b3e87 req-c7cc2b9f-99ce-4953-82f4-91ca49fd2da7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.889 187010 DEBUG oslo_concurrency.lockutils [req-9dca94e4-77f9-4fe2-9612-f534cb1b3e87 req-c7cc2b9f-99ce-4953-82f4-91ca49fd2da7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.890 187010 DEBUG nova.compute.manager [req-9dca94e4-77f9-4fe2-9612-f534cb1b3e87 req-c7cc2b9f-99ce-4953-82f4-91ca49fd2da7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Processing event network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.891 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.894 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765446945.8942363, 08f2dd2f-0591-40fb-a7d3-db68e22f407e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.894 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] VM Resumed (Lifecycle Event)
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.896 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.899 187010 INFO nova.virt.libvirt.driver [-] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Instance spawned successfully.
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.900 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.928 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.933 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.933 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.934 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.934 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.935 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.935 187010 DEBUG nova.virt.libvirt.driver [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.940 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.940 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.953 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:55:45 np0005555140 nova_compute[187006]: 2025-12-11 09:55:45.970 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.002 187010 INFO nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Took 12.36 seconds to spawn the instance on the hypervisor.
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.003 187010 DEBUG nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.062 187010 INFO nova.compute.manager [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Took 12.82 seconds to build instance.
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.078 187010 DEBUG oslo_concurrency.lockutils [None req-da366bc8-d610-4dc7-bbfc-d7c1b3071c30 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.302 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:55:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:46.302 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 04:55:46 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:46.304 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.340 187010 DEBUG nova.network.neutron [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Updated VIF entry in instance network info cache for port dc2871d6-1959-414d-9a58-f15f28494032. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.341 187010 DEBUG nova.network.neutron [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Updating instance_info_cache with network_info: [{"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.354 187010 DEBUG oslo_concurrency.lockutils [req-b1109afc-a7dd-4377-8fba-d3fd0011ae17 req-458c078f-cc19-44be-96ad-aa39415775bc b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 04:55:46 np0005555140 podman[215551]: 2025-12-11 09:55:46.714088926 +0000 UTC m=+0.070431185 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Dec 11 04:55:46 np0005555140 nova_compute[187006]: 2025-12-11 09:55:46.738 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:55:46 np0005555140 podman[215550]: 2025-12-11 09:55:46.742325743 +0000 UTC m=+0.108245576 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:55:47 np0005555140 nova_compute[187006]: 2025-12-11 09:55:47.989 187010 DEBUG nova.compute.manager [req-62ebe8c5-ad26-4bed-b603-41e04774c94a req-90b64522-faa7-4a0e-abfe-cb38c04d55c9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 04:55:47 np0005555140 nova_compute[187006]: 2025-12-11 09:55:47.989 187010 DEBUG oslo_concurrency.lockutils [req-62ebe8c5-ad26-4bed-b603-41e04774c94a req-90b64522-faa7-4a0e-abfe-cb38c04d55c9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:55:47 np0005555140 nova_compute[187006]: 2025-12-11 09:55:47.990 187010 DEBUG oslo_concurrency.lockutils [req-62ebe8c5-ad26-4bed-b603-41e04774c94a req-90b64522-faa7-4a0e-abfe-cb38c04d55c9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:55:47 np0005555140 nova_compute[187006]: 2025-12-11 09:55:47.990 187010 DEBUG oslo_concurrency.lockutils [req-62ebe8c5-ad26-4bed-b603-41e04774c94a req-90b64522-faa7-4a0e-abfe-cb38c04d55c9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:55:47 np0005555140 nova_compute[187006]: 2025-12-11 09:55:47.990 187010 DEBUG nova.compute.manager [req-62ebe8c5-ad26-4bed-b603-41e04774c94a req-90b64522-faa7-4a0e-abfe-cb38c04d55c9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] No waiting events found dispatching network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 04:55:47 np0005555140 nova_compute[187006]: 2025-12-11 09:55:47.990 187010 WARNING nova.compute.manager [req-62ebe8c5-ad26-4bed-b603-41e04774c94a req-90b64522-faa7-4a0e-abfe-cb38c04d55c9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received unexpected event network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 for instance with vm_state active and task_state None.
Dec 11 04:55:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:48.621 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:55:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:48.622 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:55:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:48.622 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:55:50 np0005555140 nova_compute[187006]: 2025-12-11 09:55:50.084 187010 DEBUG nova.compute.manager [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-changed-dc2871d6-1959-414d-9a58-f15f28494032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 04:55:50 np0005555140 nova_compute[187006]: 2025-12-11 09:55:50.084 187010 DEBUG nova.compute.manager [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Refreshing instance network info cache due to event network-changed-dc2871d6-1959-414d-9a58-f15f28494032. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 04:55:50 np0005555140 nova_compute[187006]: 2025-12-11 09:55:50.085 187010 DEBUG oslo_concurrency.lockutils [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 04:55:50 np0005555140 nova_compute[187006]: 2025-12-11 09:55:50.085 187010 DEBUG oslo_concurrency.lockutils [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 04:55:50 np0005555140 nova_compute[187006]: 2025-12-11 09:55:50.086 187010 DEBUG nova.network.neutron [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Refreshing network info cache for port dc2871d6-1959-414d-9a58-f15f28494032 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.167 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'name': 'tempest-TestNetworkBasicOps-server-1259410543', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'user_id': '277eaa28c80b403abb371276e6721821', 'hostId': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.170 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'name': 'tempest-TestNetworkBasicOps-server-1547843101', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'user_id': '277eaa28c80b403abb371276e6721821', 'hostId': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.172 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 73beb74f-4086-464a-a2ac-addc789a6c70 / tap466d3171-bc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.173 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.175 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 08f2dd2f-0591-40fb-a7d3-db68e22f407e / tapdc2871d6-19 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.175 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3917b851-3d37-45f6-baa7-c4a8f085aec7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.170395', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92da8e6e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': '221ed57f84e451e9194df25d4d93807ac208982af5ab149b3bd5c4bd2fbed9fb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.170395', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92daeb66-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': '0d26b61723077856c288094efc2ddc1c1562b0acc03097a4a2d257c65a218e21'}]}, 'timestamp': '2025-12-11 09:55:50.175990', '_unique_id': '62711401805b4ff18b34aff5618aa843'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.178 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.178 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc1d93f9-b005-40ef-bd28-f3e7652a0f9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.178281', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92db5056-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': 'f90be6e5773ef0e5ee4b7ab8c0225a034fff78fe5f516d34629da1b6f136ef4e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.178281', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92db5baa-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': '0dd0f55e10acde06c9a9aa492697b89b730cf442b741e5bc488714d233bf4ffd'}]}, 'timestamp': '2025-12-11 09:55:50.178811', '_unique_id': '8ce29a0e3c7f4062925eb9d4340b0b4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.179 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.216 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.write.latency volume: 5263693341 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.217 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.270 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.271 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '433baa20-0c07-4984-8137-9b63b8ea280a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5263693341, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.181119', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92e12d14-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '9c80e7c304c24647c633ff259c1ff1fb63c91dd24b9ea2bd835bd3b262042a3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.181119', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92e144e8-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '84e8fccd5446b19d236cdd51b59c2ea04f346c860b3e532b5cf975bc9735e81d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.181119', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92e97d52-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': 'bd2a0f098b4b52faf4ad07b0d16c26495fcb3b3864e9282583bb34282a847aad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.181119', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92e98a04-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '48404915a4de0ba436cd5196810a665c62561059ff732a2bf660790f2f1bd331'}]}, 'timestamp': '2025-12-11 09:55:50.271791', '_unique_id': 'e1e2ab9fc45049c186484bfb3a017c0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.272 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.273 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.273 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.read.bytes volume: 29448704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.273 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.274 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.274 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a39d6ff-54fb-45bb-b82f-75666397fdda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29448704, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.273623', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92e9dcac-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '256cda954c1ac628a9172154b02f6060683041e019cce22b3c81c6fb39b7c415'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.273623', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92e9e65c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '8d26a9f47d48b54ac5bc3d419c9a2f7e3aa9743aea487a5601bbeae452797891'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.273623', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92e9eee0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': 'e51ec7df689cfa9c3e8371a3c62692fab27d0174274d1025387097e34f6fc99b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.273623', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92e9f71e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '4d0598f0daed7af1395be386c35f476bab621cf0ec8ed5e8ee8a4e4353f56670'}]}, 'timestamp': '2025-12-11 09:55:50.274539', '_unique_id': '73e0ff34f40b41b78e97a9cb5289964b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.275 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.276 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.276 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.276 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f85d3c4-0a67-4957-816a-6edb63afc90e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.275796', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92ea31f2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '38931418a98e2f893d249f3ab8a702194fd6b2fad3a179fdc8edc4b2f0279765'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': 
None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.275796', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92ea4124-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '89c0cfc21e3ccaad0554f319745e953e6e60caa3c50fdeaabb4957abae149546'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.275796', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92ea4976-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '233ca4107332fe8e542f6b5f492f7f0dafd7579b4fbf7e0d0a1e0f220ae862b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.275796', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92ea5178-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '22e55f0578ee8a64ba666d73ed950d5b0b7e4217b34e3bf8702293b22ea6e1c6'}]}, 'timestamp': '2025-12-11 09:55:50.276847', '_unique_id': 'c1926757e43b435cbec964608c0771d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.277 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.278 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.290 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.291 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.305 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.306 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '842a121a-5eeb-40c7-96ce-157a1b402b2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.278191', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92ec822c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.903400084, 'message_signature': 'd6b246e25e3ccead144e09a65fc998bce82838e3653d3f196a1781053e03d029'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.278191', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92ec9546-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.903400084, 'message_signature': '0c80bdf5aacdc415c7ef8f51f6d83d4cc9d130e1b2bfe4a8c1aeb3ea355b99fe'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.278191', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92eecb40-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.916930221, 'message_signature': 'f45609273fc850e39de3f8d44a59d53d60f3049a691667bff8268fb16e5b7c61'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.278191', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92eedc48-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.916930221, 'message_signature': '54fa3a4f2d6e82745abc585c9a90dffe9e85938310a89691dacca31335a0aacb'}]}, 'timestamp': '2025-12-11 09:55:50.306718', '_unique_id': 'f1736437688d4ab49e678bb8e0ed9c77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.307 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.308 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.309 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.309 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>]
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.309 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.322 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/cpu volume: 10780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.339 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/cpu volume: 4240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf8b7724-9bbd-446e-acf3-dc591ecdd8d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10780000000, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'timestamp': '2025-12-11T09:55:50.309596', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '92f140a0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.947121224, 'message_signature': '4dcc49bf7600419cf358c3152c42251811010bea050e365635b0c5f4ca03774e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4240000000, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'timestamp': '2025-12-11T09:55:50.309596', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '92f3ead0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.964548533, 'message_signature': '9ac94f14f7da4a6dab525ed9da2deb90f53ed95bcbd07d537dcb229f910a564e'}]}, 'timestamp': '2025-12-11 09:55:50.339844', '_unique_id': 'db7d42b1cedb4d18aa4ecb3e442366aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.341 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c063ee4e-80c3-4241-b3a0-78d8ff79cba3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.341723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f44174-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': '26666e3698bcb372d1d8d4257fce6f26505718d2a0cfb0cf8da01b0234034a1b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.341723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f44a0c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': '1d896f085807b41e47e69fa4f8ebe81da68c0ed5c762e652ab8c2d04355bb2bd'}]}, 'timestamp': '2025-12-11 09:55:50.342201', '_unique_id': '02964e1ed2584b01a9ec5974ec858f57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.342 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.343 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.343 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.343 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.344 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7de78ea7-2033-454c-ba9a-fdc4108e56ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.343534', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f486fc-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.903400084, 'message_signature': 'f747934b756ad8e9eff9c920c5e2a13dc67e0f0eb27a7e5ebda704b5c05d981a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.343534', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f48f58-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.903400084, 'message_signature': '3109e690b4736222aed6bde0f33e501c3d8e3faa44cae654887529b61ffeac63'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.343534', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 
'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f49750-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.916930221, 'message_signature': 'f593172cb687e15887b8dd75ea3b6a643dfc22f86eae5e4fd5596a8e5ec03e0e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.343534', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f4a402-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.916930221, 'message_signature': '216969d6c4ae16c414e3f05202eefe70526edbbbe62fce96584e110d15b3bf29'}]}, 'timestamp': '2025-12-11 09:55:50.344494', '_unique_id': '36ef6aaee2154f10b96fc525e47eedfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.345 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.outgoing.packets volume: 109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cf28b8d-c61a-49e3-b68b-e97db7d76b23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 109, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.345848', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f4e21e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': '523d8c4d509458319af35d658639bf126473c6d6457b9507db989bdd5333de7d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.345848', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f4ea84-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': 'eed3071c5becad17ea417eb600115bdd98553a552b540a481a4b811825119fee'}]}, 'timestamp': '2025-12-11 09:55:50.346373', '_unique_id': 'cdfa06bcf2924eae87b90f1502baa558'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.346 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.347 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.read.requests volume: 1068 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.347 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.347 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5cd018a-9f49-4f93-93a2-4f3af0df2101', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1068, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.347557', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f523fa-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': 'e3b1d163d2a1bdd4b6dff5f3f2ccd6b537bc70052effa9b7e57f7b0adabebfbc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': 
None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.347557', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f52c38-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': 'ced9fe3b9973f8beec3079e0b60428fe969665aaa31affe0c8623373f552359b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.347557', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f533d6-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '3c561a963aee7b47c85b23b3637a70a3123ef320b2106518261df445736b2a16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.347557', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f53ad4-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '0936b3eb6a77b301245eb05822113d9b878b5fb71cb886d738bbb7dbb7bd627b'}]}, 'timestamp': '2025-12-11 09:55:50.348398', '_unique_id': 'f1038201e62e407aa12ed926f458ce34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.348 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.349 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.349 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.read.latency volume: 183466261 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.349 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.read.latency volume: 21337371 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.read.latency volume: 188235938 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.read.latency volume: 826964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a1b5e3e-f332-4b1c-b325-1b21f8b6f9dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 183466261, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.349570', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f57288-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '795da3b64931765eb4bb874052ed10cbb378cbb0ed8a12649ef987cf6155566f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21337371, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': 
None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.349570', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f57b20-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '4904d500131f61c98af46724f1bba881c7153a8e351fa5698bbc329f567443b9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 188235938, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.349570', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f5832c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': 'eb0e5253ba2858bb2aa04aa10c812d50a181d972aa643cb76b262f8fda9a12cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 826964, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.349570', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f58a66-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '8056569cda1997e9d31afdc006f2283dd8a72032a327744831c1a8a81853e7ce'}]}, 'timestamp': '2025-12-11 09:55:50.350432', '_unique_id': 'db209dc577d046d8829cf4c5b0e61f7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.350 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.351 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.incoming.bytes volume: 19142 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.351 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cf3a68c-8ddd-4777-9bac-a2e88c20946b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19142, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.351615', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f5c27e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': 'babb48f7c97cdcdc9538ab566323922758216dfcad4439c71d541d47f4152894'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.351615', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f5cbf2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': '960cb93c1426cfcca3c1a00e8bb1fcef5e7eade876e7fae136b6e8004b2adca8'}]}, 'timestamp': '2025-12-11 09:55:50.352704', '_unique_id': '739c4799501c43b48ad74ddfe420080f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.354 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.354 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ebeb31e-c21c-4ebc-bba6-20b429935574', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.354040', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f621c4-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': '4094c471acb81f8507a3bbe667211bea6b5e877dcfda73429ad8fae3a4e48583'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.354040', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f62a20-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': 'faa9e39c0a0de71f78fe51d5fd84bd9400dc6a89d0e9ae428aeee787724e22de'}]}, 'timestamp': '2025-12-11 09:55:50.354482', '_unique_id': '00b5980c45784ca39ad2612a14f8eb5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.incoming.packets volume: 105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.355 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18758f43-543b-4507-87c9-d3661f382445', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 105, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.355621', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f660b2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': 'ae13d591250d29e098563dd1d1484ca0b9d533fa7c27bf14eb4f807099bda14a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.355621', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f66a8a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': '39a021f4537d1427e1f6d4099ba896a7b523ab2c8c209bb3f108f4738ff50af1'}]}, 'timestamp': '2025-12-11 09:55:50.356134', '_unique_id': '0de25d5527b1483082bcdc02161c02c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.356 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.357 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.outgoing.bytes volume: 16018 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.357 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a94d96b-dc73-4932-b069-946a51396a6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16018, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.357226', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f69d98-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': 'ab9a05d032381dfef6152eb44f98b0ac16648ae5e391efeac87d247fdbe4d27d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.357226', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f6a59a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': 'f4d5223390beb95b25adfeeed76ec1d6adc60700c7db5fbce3f3427756c6faf4'}]}, 'timestamp': '2025-12-11 09:55:50.357644', '_unique_id': 'a41ae7238907433ab44cc5e0dbfc583b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.358 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd46485a7-8d23-4b79-ade9-5ac7285d18db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.358748', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f6dba0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': '67c13dac64c8d9edfbe1e2fac526d8e1145d5555d46bc555ef20f748c1dc3343'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.358748', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f6e3ac-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': '387a7488507b3d5a7ba9223c1d5dac0e71f195d7573f37b05cf08d410cf3c54c'}]}, 'timestamp': '2025-12-11 09:55:50.359232', '_unique_id': '912ca2eca2c944bca93f79188650dd3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.359 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.360 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.360 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.360 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>]
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.360 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.360 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.360 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e9138c3-919e-4f28-852b-4ce018aa06d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000004-73beb74f-4086-464a-a2ac-addc789a6c70-tap466d3171-bc', 'timestamp': '2025-12-11T09:55:50.360603', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'tap466d3171-bc', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:38:b9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap466d3171-bc'}, 'message_id': '92f72164-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.795594021, 'message_signature': '15c4e3af6c0c79773910cd2f534757c92a13e591d81c8c641f6eeab1fafd2e87'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000005-08f2dd2f-0591-40fb-a7d3-db68e22f407e-tapdc2871d6-19', 'timestamp': '2025-12-11T09:55:50.360603', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'tapdc2871d6-19', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:c1:2a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc2871d6-19'}, 'message_id': '92f72c18-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.798844264, 'message_signature': '47195409409a2c85ea8b885606cee3919577290d7a551888fb8fb6b1a185db21'}]}, 'timestamp': '2025-12-11 09:55:50.361087', '_unique_id': '993f5364b9c74fb3bd86692a770bf022'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.361 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>]
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1259410543>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1547843101>]
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.write.bytes volume: 72994816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.362 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.363 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.363 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7a8d0dc-c9fc-404c-8f38-a3ceb86d7bec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72994816, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.362708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f7739e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': '374bb7c29bfffbd0667efc5fc3149984360093bf6ce626cfc7f36e43246c4ccf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.362708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f77d8a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.806318187, 'message_signature': 'ee750eda2031c76d8f76007ed3eede2c9258d9c5a6768a4719bb4d0465d856d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.362708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f784d8-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '891fe8230f102adb330f1caec4e75d0dce55dac2c31f440aa2327416dfb07c9f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.362708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f78bea-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.842869703, 'message_signature': '3cd369e015802dfe8ac448af60a27bde2f2bee9305098c665706d6417c488ed9'}]}, 'timestamp': '2025-12-11 09:55:50.363531', '_unique_id': '0e17318d125a4e9ba12704d823503245'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.364 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/memory.usage volume: 42.828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 08f2dd2f-0591-40fb-a7d3-db68e22f407e: ceilometer.compute.pollsters.NoVolumeException
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e16617d-9ef9-4bc6-9ec9-fc62ab5f94f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.828125, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'timestamp': '2025-12-11T09:55:50.365025', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '92f7ce52-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.947121224, 'message_signature': 'de1dfbac1cd7dc3d260338f776d6751d1cfb4c8bc3f7c73d1f9e7336709961bc'}]}, 'timestamp': '2025-12-11 09:55:50.365395', '_unique_id': 'bad4152a66314f67aac51857012595c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.365 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.366 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.366 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.366 12 DEBUG ceilometer.compute.pollsters [-] 73beb74f-4086-464a-a2ac-addc789a6c70/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.366 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 DEBUG ceilometer.compute.pollsters [-] 08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a838c42c-2856-46f6-8580-efa864815dae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '73beb74f-4086-464a-a2ac-addc789a6c70-vda', 'timestamp': '2025-12-11T09:55:50.366454', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f805de-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.903400084, 'message_signature': '30ef28265c21c741e1cd7330255c5ae9874a9b648261367b89574fb897539ced'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'73beb74f-4086-464a-a2ac-addc789a6c70-sda', 'timestamp': '2025-12-11T09:55:50.366454', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1259410543', 'name': 'instance-00000004', 'instance_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f80e44-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.903400084, 'message_signature': '4c39bde3807c7d80381e41802196e7df79482d50063e6f99a12e8aac0386cd89'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-vda', 'timestamp': '2025-12-11T09:55:50.366454', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f817cc-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.916930221, 'message_signature': '966d926974b5632f416e22a55f6ce9b567cd1d92fcb5810768cac28975a79651'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e-sda', 'timestamp': '2025-12-11T09:55:50.366454', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1547843101', 'name': 'instance-00000005', 'instance_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f81f10-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3308.916930221, 'message_signature': '26838117f6fcc8ae106b104c7b6a935f09f3bd5acbb3a3d12dd3bfb660620094'}]}, 'timestamp': '2025-12-11 09:55:50.367295', '_unique_id': '05115aa1c96546228e9a2bf7a8c074d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:55:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:55:50.367 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:55:50 np0005555140 nova_compute[187006]: 2025-12-11 09:55:50.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:50 np0005555140 nova_compute[187006]: 2025-12-11 09:55:50.955 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:51 np0005555140 nova_compute[187006]: 2025-12-11 09:55:51.307 187010 DEBUG nova.network.neutron [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Updated VIF entry in instance network info cache for port dc2871d6-1959-414d-9a58-f15f28494032. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:55:51 np0005555140 nova_compute[187006]: 2025-12-11 09:55:51.307 187010 DEBUG nova.network.neutron [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Updating instance_info_cache with network_info: [{"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:55:51 np0005555140 nova_compute[187006]: 2025-12-11 09:55:51.324 187010 DEBUG oslo_concurrency.lockutils [req-670c4d5a-cd1c-43e7-aa77-bbc9e90f685d req-adb81882-2647-4baf-ae49-abd6d6961fab b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-08f2dd2f-0591-40fb-a7d3-db68e22f407e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:55:51 np0005555140 nova_compute[187006]: 2025-12-11 09:55:51.740 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:52 np0005555140 nova_compute[187006]: 2025-12-11 09:55:52.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:52 np0005555140 nova_compute[187006]: 2025-12-11 09:55:52.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:55:52 np0005555140 nova_compute[187006]: 2025-12-11 09:55:52.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:55:53 np0005555140 nova_compute[187006]: 2025-12-11 09:55:53.673 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:55:53 np0005555140 nova_compute[187006]: 2025-12-11 09:55:53.673 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquired lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:55:53 np0005555140 nova_compute[187006]: 2025-12-11 09:55:53.674 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 04:55:53 np0005555140 nova_compute[187006]: 2025-12-11 09:55:53.674 187010 DEBUG nova.objects.instance [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 73beb74f-4086-464a-a2ac-addc789a6c70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:55:54 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:55:54.307 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:55:55 np0005555140 podman[215596]: 2025-12-11 09:55:55.676752342 +0000 UTC m=+0.052453711 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.700 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updating instance_info_cache with network_info: [{"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.723 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Releasing lock "refresh_cache-73beb74f-4086-464a-a2ac-addc789a6c70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.724 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.724 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.724 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.725 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.762 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.763 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.763 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.763 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.846 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.928 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.929 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.957 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:55 np0005555140 nova_compute[187006]: 2025-12-11 09:55:55.998 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.006 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.095 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.096 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.151 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.318 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.319 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5437MB free_disk=73.29906463623047GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.319 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.320 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.541 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 73beb74f-4086-464a-a2ac-addc789a6c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.542 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 08f2dd2f-0591-40fb-a7d3-db68e22f407e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.542 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.542 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.591 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.745 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.810 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.837 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.838 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.518s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.943 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.943 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.944 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:56 np0005555140 nova_compute[187006]: 2025-12-11 09:55:56.944 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:55:57 np0005555140 nova_compute[187006]: 2025-12-11 09:55:57.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:55:57 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:57Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:c1:2a 10.100.0.5
Dec 11 04:55:57 np0005555140 ovn_controller[95438]: 2025-12-11T09:55:57Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:c1:2a 10.100.0.5
Dec 11 04:56:00 np0005555140 nova_compute[187006]: 2025-12-11 09:56:00.962 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:01 np0005555140 nova_compute[187006]: 2025-12-11 09:56:01.744 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.588 187010 INFO nova.compute.manager [None req-9d2012f4-ff7a-46a6-8b9d-2897bdf07d76 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Get console output#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.594 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 04:56:03 np0005555140 podman[215645]: 2025-12-11 09:56:03.693688212 +0000 UTC m=+0.065516554 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 04:56:03 np0005555140 podman[215644]: 2025-12-11 09:56:03.730921637 +0000 UTC m=+0.099421514 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.903 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.903 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.904 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.904 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.904 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.905 187010 INFO nova.compute.manager [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Terminating instance#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.906 187010 DEBUG nova.compute.manager [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:56:03 np0005555140 kernel: tapdc2871d6-19 (unregistering): left promiscuous mode
Dec 11 04:56:03 np0005555140 NetworkManager[55531]: <info>  [1765446963.9283] device (tapdc2871d6-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.934 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:03 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:03Z|00076|binding|INFO|Releasing lport dc2871d6-1959-414d-9a58-f15f28494032 from this chassis (sb_readonly=0)
Dec 11 04:56:03 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:03Z|00077|binding|INFO|Setting lport dc2871d6-1959-414d-9a58-f15f28494032 down in Southbound
Dec 11 04:56:03 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:03Z|00078|binding|INFO|Removing iface tapdc2871d6-19 ovn-installed in OVS
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.938 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:03.945 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:c1:2a 10.100.0.5'], port_security=['fa:16:3e:82:c1:2a 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '08f2dd2f-0591-40fb-a7d3-db68e22f407e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a7c212f1-a243-4ff7-936f-52fadb00dcc7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=579228a5-41f0-484d-8fb8-96b0a910160e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=dc2871d6-1959-414d-9a58-f15f28494032) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:56:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:03.948 104288 INFO neutron.agent.ovn.metadata.agent [-] Port dc2871d6-1959-414d-9a58-f15f28494032 in datapath 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf unbound from our chassis#033[00m
Dec 11 04:56:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:03.950 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf#033[00m
Dec 11 04:56:03 np0005555140 nova_compute[187006]: 2025-12-11 09:56:03.951 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:03.969 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[7607200a-7cae-44ef-aaff-b29d29621848]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:03 np0005555140 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec 11 04:56:03 np0005555140 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.061s CPU time.
Dec 11 04:56:03 np0005555140 systemd-machined[153398]: Machine qemu-5-instance-00000005 terminated.
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.002 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[8a332fbf-c5f4-4f14-880a-8e84d20016a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.006 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[3d76d8bf-8a43-4b4b-8a48-4e5c6d487792]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.037 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[03c14307-cae7-47ff-94e4-174dc8611029]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.059 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[08d7f8b5-b9eb-459a-8987-9353e373d1d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316d1cbf-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:9c:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326117, 'reachable_time': 43680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215693, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.078 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe2a697-855c-4d09-adfc-003b512c1a42]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap316d1cbf-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326128, 'tstamp': 326128}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215694, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap316d1cbf-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326132, 'tstamp': 326132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215694, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.080 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316d1cbf-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.081 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.084 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.085 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap316d1cbf-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.085 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.086 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap316d1cbf-30, col_values=(('external_ids', {'iface-id': 'c8d19757-078a-4c1b-90f0-a2eb4ce2f692'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:04 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:04.086 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.129 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.134 187010 DEBUG nova.compute.manager [req-c8573b38-24f1-4fd1-a070-b04e017ff6d9 req-26c6d297-df6a-4163-8820-bc5f64532af3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-vif-unplugged-dc2871d6-1959-414d-9a58-f15f28494032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.134 187010 DEBUG oslo_concurrency.lockutils [req-c8573b38-24f1-4fd1-a070-b04e017ff6d9 req-26c6d297-df6a-4163-8820-bc5f64532af3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.134 187010 DEBUG oslo_concurrency.lockutils [req-c8573b38-24f1-4fd1-a070-b04e017ff6d9 req-26c6d297-df6a-4163-8820-bc5f64532af3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.135 187010 DEBUG oslo_concurrency.lockutils [req-c8573b38-24f1-4fd1-a070-b04e017ff6d9 req-26c6d297-df6a-4163-8820-bc5f64532af3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.135 187010 DEBUG nova.compute.manager [req-c8573b38-24f1-4fd1-a070-b04e017ff6d9 req-26c6d297-df6a-4163-8820-bc5f64532af3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] No waiting events found dispatching network-vif-unplugged-dc2871d6-1959-414d-9a58-f15f28494032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.135 187010 DEBUG nova.compute.manager [req-c8573b38-24f1-4fd1-a070-b04e017ff6d9 req-26c6d297-df6a-4163-8820-bc5f64532af3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-vif-unplugged-dc2871d6-1959-414d-9a58-f15f28494032 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.136 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.174 187010 INFO nova.virt.libvirt.driver [-] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Instance destroyed successfully.#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.175 187010 DEBUG nova.objects.instance [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 08f2dd2f-0591-40fb-a7d3-db68e22f407e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.187 187010 DEBUG nova.virt.libvirt.vif [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:55:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1547843101',display_name='tempest-TestNetworkBasicOps-server-1547843101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1547843101',id=5,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI/HAB35jDnYlAgXSwOdSgEZ8lFtCDNF0w7iX0GHWRXYBFuH9pMMUqDaom/TGoIWxw7txoQ4Oe5AbrZCe/jRLmacRyAjeWwFfXB8zEjFKBDCmRRkmtI5W4ThQSGHYXz/XQ==',key_name='tempest-TestNetworkBasicOps-1790804775',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:55:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-rup0ei7a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:55:46Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=08f2dd2f-0591-40fb-a7d3-db68e22f407e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.187 187010 DEBUG nova.network.os_vif_util [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "dc2871d6-1959-414d-9a58-f15f28494032", "address": "fa:16:3e:82:c1:2a", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2871d6-19", "ovs_interfaceid": "dc2871d6-1959-414d-9a58-f15f28494032", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.188 187010 DEBUG nova.network.os_vif_util [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:c1:2a,bridge_name='br-int',has_traffic_filtering=True,id=dc2871d6-1959-414d-9a58-f15f28494032,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2871d6-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.189 187010 DEBUG os_vif [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:c1:2a,bridge_name='br-int',has_traffic_filtering=True,id=dc2871d6-1959-414d-9a58-f15f28494032,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2871d6-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.190 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.190 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc2871d6-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.191 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.193 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.195 187010 INFO os_vif [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:c1:2a,bridge_name='br-int',has_traffic_filtering=True,id=dc2871d6-1959-414d-9a58-f15f28494032,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2871d6-19')#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.195 187010 INFO nova.virt.libvirt.driver [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Deleting instance files /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e_del#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.196 187010 INFO nova.virt.libvirt.driver [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Deletion of /var/lib/nova/instances/08f2dd2f-0591-40fb-a7d3-db68e22f407e_del complete#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.260 187010 INFO nova.compute.manager [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.261 187010 DEBUG oslo.service.loopingcall [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.261 187010 DEBUG nova.compute.manager [-] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:56:04 np0005555140 nova_compute[187006]: 2025-12-11 09:56:04.262 187010 DEBUG nova.network.neutron [-] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.483 187010 DEBUG nova.network.neutron [-] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.502 187010 INFO nova.compute.manager [-] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Took 1.24 seconds to deallocate network for instance.#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.547 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.547 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.552 187010 DEBUG nova.compute.manager [req-b50a4d7f-14bd-453c-81cf-30c115769ee0 req-59efb488-b8af-4da2-b0ed-b38996955c91 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-vif-deleted-dc2871d6-1959-414d-9a58-f15f28494032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.626 187010 DEBUG nova.compute.provider_tree [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.643 187010 DEBUG nova.scheduler.client.report [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.665 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.693 187010 INFO nova.scheduler.client.report [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 08f2dd2f-0591-40fb-a7d3-db68e22f407e#033[00m
Dec 11 04:56:05 np0005555140 nova_compute[187006]: 2025-12-11 09:56:05.747 187010 DEBUG oslo_concurrency.lockutils [None req-1fbd4cd9-691f-4c05-8f6b-e14c7270c8dc 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:06 np0005555140 nova_compute[187006]: 2025-12-11 09:56:06.374 187010 DEBUG nova.compute.manager [req-6580e87b-a7ed-4610-86d9-a6722250db19 req-d21119f0-3a8b-4cb0-99cb-d59c711d9864 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received event network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:06 np0005555140 nova_compute[187006]: 2025-12-11 09:56:06.374 187010 DEBUG oslo_concurrency.lockutils [req-6580e87b-a7ed-4610-86d9-a6722250db19 req-d21119f0-3a8b-4cb0-99cb-d59c711d9864 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:06 np0005555140 nova_compute[187006]: 2025-12-11 09:56:06.375 187010 DEBUG oslo_concurrency.lockutils [req-6580e87b-a7ed-4610-86d9-a6722250db19 req-d21119f0-3a8b-4cb0-99cb-d59c711d9864 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:06 np0005555140 nova_compute[187006]: 2025-12-11 09:56:06.375 187010 DEBUG oslo_concurrency.lockutils [req-6580e87b-a7ed-4610-86d9-a6722250db19 req-d21119f0-3a8b-4cb0-99cb-d59c711d9864 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "08f2dd2f-0591-40fb-a7d3-db68e22f407e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:06 np0005555140 nova_compute[187006]: 2025-12-11 09:56:06.375 187010 DEBUG nova.compute.manager [req-6580e87b-a7ed-4610-86d9-a6722250db19 req-d21119f0-3a8b-4cb0-99cb-d59c711d9864 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] No waiting events found dispatching network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:56:06 np0005555140 nova_compute[187006]: 2025-12-11 09:56:06.375 187010 WARNING nova.compute.manager [req-6580e87b-a7ed-4610-86d9-a6722250db19 req-d21119f0-3a8b-4cb0-99cb-d59c711d9864 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Received unexpected event network-vif-plugged-dc2871d6-1959-414d-9a58-f15f28494032 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:56:06 np0005555140 nova_compute[187006]: 2025-12-11 09:56:06.747 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.041 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.041 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.042 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.042 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.042 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.043 187010 INFO nova.compute.manager [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Terminating instance#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.044 187010 DEBUG nova.compute.manager [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:56:07 np0005555140 kernel: tap466d3171-bc (unregistering): left promiscuous mode
Dec 11 04:56:07 np0005555140 NetworkManager[55531]: <info>  [1765446967.0686] device (tap466d3171-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:56:07 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:07Z|00079|binding|INFO|Releasing lport 466d3171-bc29-4505-a82e-b17655482b51 from this chassis (sb_readonly=0)
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.074 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:07Z|00080|binding|INFO|Setting lport 466d3171-bc29-4505-a82e-b17655482b51 down in Southbound
Dec 11 04:56:07 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:07Z|00081|binding|INFO|Removing iface tap466d3171-bc ovn-installed in OVS
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.078 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.091 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:38:b9 10.100.0.4'], port_security=['fa:16:3e:03:38:b9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '73beb74f-4086-464a-a2ac-addc789a6c70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7dff39d4-6c24-429a-8738-9686bb4cece9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=579228a5-41f0-484d-8fb8-96b0a910160e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=466d3171-bc29-4505-a82e-b17655482b51) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.093 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.095 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 466d3171-bc29-4505-a82e-b17655482b51 in datapath 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf unbound from our chassis#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.097 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 316d1cbf-3d8b-4adf-8c17-c7fde41c1daf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.098 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa39f3f-0c64-4215-bb7c-1ebd0af3a209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.098 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf namespace which is not needed anymore#033[00m
Dec 11 04:56:07 np0005555140 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 11 04:56:07 np0005555140 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 14.100s CPU time.
Dec 11 04:56:07 np0005555140 systemd-machined[153398]: Machine qemu-4-instance-00000004 terminated.
Dec 11 04:56:07 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [NOTICE]   (215260) : haproxy version is 2.8.14-c23fe91
Dec 11 04:56:07 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [NOTICE]   (215260) : path to executable is /usr/sbin/haproxy
Dec 11 04:56:07 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [WARNING]  (215260) : Exiting Master process...
Dec 11 04:56:07 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [WARNING]  (215260) : Exiting Master process...
Dec 11 04:56:07 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [ALERT]    (215260) : Current worker (215262) exited with code 143 (Terminated)
Dec 11 04:56:07 np0005555140 neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf[215256]: [WARNING]  (215260) : All workers exited. Exiting... (0)
Dec 11 04:56:07 np0005555140 systemd[1]: libpod-884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2.scope: Deactivated successfully.
Dec 11 04:56:07 np0005555140 podman[215737]: 2025-12-11 09:56:07.257733207 +0000 UTC m=+0.053029567 container died 884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 04:56:07 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2-userdata-shm.mount: Deactivated successfully.
Dec 11 04:56:07 np0005555140 systemd[1]: var-lib-containers-storage-overlay-647f1ed7a4743482092546565bb541ff52af18c0f91fc001b7101532b6c68d1f-merged.mount: Deactivated successfully.
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.313 187010 INFO nova.virt.libvirt.driver [-] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Instance destroyed successfully.#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.315 187010 DEBUG nova.objects.instance [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 73beb74f-4086-464a-a2ac-addc789a6c70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:56:07 np0005555140 podman[215737]: 2025-12-11 09:56:07.316439686 +0000 UTC m=+0.111736006 container cleanup 884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 11 04:56:07 np0005555140 systemd[1]: libpod-conmon-884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2.scope: Deactivated successfully.
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.332 187010 DEBUG nova.virt.libvirt.vif [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1259410543',display_name='tempest-TestNetworkBasicOps-server-1259410543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1259410543',id=4,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONeMUed8EI2O0baeaJqPw83CgKBoj8BW4j9gYw/NtVLiRZG7S5SH8/gYx4O5Tj31WYQUncS3fWovBKgoJyY/mxtL19J3YMn/K4n7p4Y5xezpDxqNzH4AbNzYAMNJVh8+A==',key_name='tempest-TestNetworkBasicOps-2100690359',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:55:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-vvooq7ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:55:04Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=73beb74f-4086-464a-a2ac-addc789a6c70,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.333 187010 DEBUG nova.network.os_vif_util [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "466d3171-bc29-4505-a82e-b17655482b51", "address": "fa:16:3e:03:38:b9", "network": {"id": "316d1cbf-3d8b-4adf-8c17-c7fde41c1daf", "bridge": "br-int", "label": "tempest-network-smoke--1128977112", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466d3171-bc", "ovs_interfaceid": "466d3171-bc29-4505-a82e-b17655482b51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.333 187010 DEBUG nova.network.os_vif_util [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:38:b9,bridge_name='br-int',has_traffic_filtering=True,id=466d3171-bc29-4505-a82e-b17655482b51,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466d3171-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.334 187010 DEBUG os_vif [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:38:b9,bridge_name='br-int',has_traffic_filtering=True,id=466d3171-bc29-4505-a82e-b17655482b51,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466d3171-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.335 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.335 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap466d3171-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.337 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.338 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.340 187010 INFO os_vif [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:38:b9,bridge_name='br-int',has_traffic_filtering=True,id=466d3171-bc29-4505-a82e-b17655482b51,network=Network(316d1cbf-3d8b-4adf-8c17-c7fde41c1daf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466d3171-bc')#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.340 187010 INFO nova.virt.libvirt.driver [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Deleting instance files /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70_del#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.341 187010 INFO nova.virt.libvirt.driver [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Deletion of /var/lib/nova/instances/73beb74f-4086-464a-a2ac-addc789a6c70_del complete#033[00m
Dec 11 04:56:07 np0005555140 podman[215786]: 2025-12-11 09:56:07.387363095 +0000 UTC m=+0.045344158 container remove 884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.392 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b621638e-8899-4450-9e36-b70bbcf65353]: (4, ('Thu Dec 11 09:56:07 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf (884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2)\n884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2\nThu Dec 11 09:56:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf (884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2)\n884abe97e9e3df396d295b1b445e6c77051dafcdff851a8998604400175601e2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.394 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[0153a996-ba6d-4aed-b677-e8d2b25ce895]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.394 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316d1cbf-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.396 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 kernel: tap316d1cbf-30: left promiscuous mode
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.398 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.400 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[25760974-f4ec-45dd-bb3a-f80434691901]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.413 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.417 187010 INFO nova.compute.manager [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.417 187010 DEBUG oslo.service.loopingcall [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.418 187010 DEBUG nova.compute.manager [-] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.418 187010 DEBUG nova.network.neutron [-] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.423 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[daf04d5e-e99b-40c2-b32c-029581ab4131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.424 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6155c6-b74c-4c19-be87-411cf27d1da8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.440 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd48cb1-2aac-471e-bcb9-f1a22681a4c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326109, 'reachable_time': 20866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215801, 'error': None, 'target': 'ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.443 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-316d1cbf-3d8b-4adf-8c17-c7fde41c1daf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:56:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:07.443 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[1052b1ba-e1da-434d-9e26-9fca4be2cf2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:07 np0005555140 systemd[1]: run-netns-ovnmeta\x2d316d1cbf\x2d3d8b\x2d4adf\x2d8c17\x2dc7fde41c1daf.mount: Deactivated successfully.
Dec 11 04:56:07 np0005555140 nova_compute[187006]: 2025-12-11 09:56:07.971 187010 DEBUG nova.network.neutron [-] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.009 187010 INFO nova.compute.manager [-] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Took 0.59 seconds to deallocate network for instance.#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.048 187010 DEBUG nova.compute.manager [req-d1e493bf-c65f-47c4-8b9b-28698461e591 req-2a7c210b-5ddf-4048-862b-2b5e5995005d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-vif-deleted-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.085 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.086 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.138 187010 DEBUG nova.compute.provider_tree [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.155 187010 DEBUG nova.scheduler.client.report [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.174 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.204 187010 INFO nova.scheduler.client.report [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 73beb74f-4086-464a-a2ac-addc789a6c70#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.266 187010 DEBUG oslo_concurrency.lockutils [None req-bc64e3cd-1691-4c55-a07d-e9702b4fb9f5 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.452 187010 DEBUG nova.compute.manager [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-vif-unplugged-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.452 187010 DEBUG oslo_concurrency.lockutils [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.453 187010 DEBUG oslo_concurrency.lockutils [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.453 187010 DEBUG oslo_concurrency.lockutils [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.453 187010 DEBUG nova.compute.manager [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] No waiting events found dispatching network-vif-unplugged-466d3171-bc29-4505-a82e-b17655482b51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.453 187010 WARNING nova.compute.manager [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received unexpected event network-vif-unplugged-466d3171-bc29-4505-a82e-b17655482b51 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.453 187010 DEBUG nova.compute.manager [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received event network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.454 187010 DEBUG oslo_concurrency.lockutils [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.454 187010 DEBUG oslo_concurrency.lockutils [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.454 187010 DEBUG oslo_concurrency.lockutils [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "73beb74f-4086-464a-a2ac-addc789a6c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.455 187010 DEBUG nova.compute.manager [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] No waiting events found dispatching network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:56:08 np0005555140 nova_compute[187006]: 2025-12-11 09:56:08.455 187010 WARNING nova.compute.manager [req-a22e3855-0e65-4dd6-981a-bdc209be3d71 req-567c9bfd-5f6a-4ca1-ae12-968e4ec8d808 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Received unexpected event network-vif-plugged-466d3171-bc29-4505-a82e-b17655482b51 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:56:09 np0005555140 podman[215802]: 2025-12-11 09:56:09.702201985 +0000 UTC m=+0.076552881 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 04:56:11 np0005555140 nova_compute[187006]: 2025-12-11 09:56:11.749 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:12 np0005555140 nova_compute[187006]: 2025-12-11 09:56:12.337 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:13 np0005555140 podman[215823]: 2025-12-11 09:56:13.710203937 +0000 UTC m=+0.075267833 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 04:56:14 np0005555140 nova_compute[187006]: 2025-12-11 09:56:14.087 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:14 np0005555140 nova_compute[187006]: 2025-12-11 09:56:14.162 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:16 np0005555140 nova_compute[187006]: 2025-12-11 09:56:16.754 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:16 np0005555140 podman[215851]: 2025-12-11 09:56:16.872244476 +0000 UTC m=+0.080532184 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 11 04:56:16 np0005555140 podman[215850]: 2025-12-11 09:56:16.882833509 +0000 UTC m=+0.101219766 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 11 04:56:17 np0005555140 nova_compute[187006]: 2025-12-11 09:56:17.339 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:19 np0005555140 nova_compute[187006]: 2025-12-11 09:56:19.173 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765446964.172435, 08f2dd2f-0591-40fb-a7d3-db68e22f407e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:56:19 np0005555140 nova_compute[187006]: 2025-12-11 09:56:19.173 187010 INFO nova.compute.manager [-] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:56:19 np0005555140 nova_compute[187006]: 2025-12-11 09:56:19.193 187010 DEBUG nova.compute.manager [None req-bdd8838b-7a8e-4509-98c1-647be1d8e856 - - - - - -] [instance: 08f2dd2f-0591-40fb-a7d3-db68e22f407e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:56:21 np0005555140 nova_compute[187006]: 2025-12-11 09:56:21.756 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:22 np0005555140 nova_compute[187006]: 2025-12-11 09:56:22.313 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765446967.3106132, 73beb74f-4086-464a-a2ac-addc789a6c70 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:56:22 np0005555140 nova_compute[187006]: 2025-12-11 09:56:22.313 187010 INFO nova.compute.manager [-] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:56:22 np0005555140 nova_compute[187006]: 2025-12-11 09:56:22.338 187010 DEBUG nova.compute.manager [None req-5b6f89f6-f637-4879-b4d7-0ea8d9c47a59 - - - - - -] [instance: 73beb74f-4086-464a-a2ac-addc789a6c70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:56:22 np0005555140 nova_compute[187006]: 2025-12-11 09:56:22.341 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:26 np0005555140 podman[215898]: 2025-12-11 09:56:26.676311446 +0000 UTC m=+0.053628645 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:56:26 np0005555140 nova_compute[187006]: 2025-12-11 09:56:26.759 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:27 np0005555140 nova_compute[187006]: 2025-12-11 09:56:27.343 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.024 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.025 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.039 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.106 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.107 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.115 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.116 187010 INFO nova.compute.claims [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.471 187010 DEBUG nova.compute.provider_tree [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.684 187010 DEBUG nova.scheduler.client.report [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.710 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.711 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.753 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.754 187010 DEBUG nova.network.neutron [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.760 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.771 187010 INFO nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:56:31 np0005555140 nova_compute[187006]: 2025-12-11 09:56:31.791 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.022 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.024 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.024 187010 INFO nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Creating image(s)#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.025 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.026 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.027 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.051 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.109 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.111 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.113 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.129 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.187 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.189 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.346 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:32 np0005555140 nova_compute[187006]: 2025-12-11 09:56:32.738 187010 DEBUG nova.policy [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.459 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk 1073741824" returned: 0 in 1.271s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.460 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.460 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.535 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.536 187010 DEBUG nova.virt.disk.api [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.537 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.614 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.615 187010 DEBUG nova.virt.disk.api [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.616 187010 DEBUG nova.objects.instance [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.635 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.636 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Ensure instance console log exists: /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.637 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.638 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:33 np0005555140 nova_compute[187006]: 2025-12-11 09:56:33.638 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:34 np0005555140 podman[215937]: 2025-12-11 09:56:34.739272923 +0000 UTC m=+0.099494817 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:56:34 np0005555140 podman[215938]: 2025-12-11 09:56:34.746449418 +0000 UTC m=+0.104452218 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 04:56:34 np0005555140 nova_compute[187006]: 2025-12-11 09:56:34.757 187010 DEBUG nova.network.neutron [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Successfully created port: 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.452 187010 DEBUG nova.network.neutron [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Successfully updated port: 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.470 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.470 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.471 187010 DEBUG nova.network.neutron [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.562 187010 DEBUG nova.compute.manager [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-changed-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.562 187010 DEBUG nova.compute.manager [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing instance network info cache due to event network-changed-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.563 187010 DEBUG oslo_concurrency.lockutils [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:56:35 np0005555140 nova_compute[187006]: 2025-12-11 09:56:35.633 187010 DEBUG nova.network.neutron [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:56:36 np0005555140 nova_compute[187006]: 2025-12-11 09:56:36.761 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:37 np0005555140 nova_compute[187006]: 2025-12-11 09:56:37.349 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:37 np0005555140 nova_compute[187006]: 2025-12-11 09:56:37.717 187010 DEBUG nova.network.neutron [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.555 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.556 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Instance network_info: |[{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.557 187010 DEBUG oslo_concurrency.lockutils [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.557 187010 DEBUG nova.network.neutron [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing network info cache for port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.561 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Start _get_guest_xml network_info=[{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.565 187010 WARNING nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.576 187010 DEBUG nova.virt.libvirt.host [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.577 187010 DEBUG nova.virt.libvirt.host [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.584 187010 DEBUG nova.virt.libvirt.host [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.584 187010 DEBUG nova.virt.libvirt.host [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.585 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.585 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.586 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.586 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.586 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.586 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.587 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.587 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.587 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.588 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.588 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.588 187010 DEBUG nova.virt.hardware [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.592 187010 DEBUG nova.virt.libvirt.vif [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:56:31Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.593 187010 DEBUG nova.network.os_vif_util [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.593 187010 DEBUG nova.network.os_vif_util [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:8b:66,bridge_name='br-int',has_traffic_filtering=True,id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5,network=Network(f6076e77-6bbe-422f-b8fa-650192fcd178),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3787f1e5-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.595 187010 DEBUG nova.objects.instance [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.609 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <uuid>d1980092-5559-4d0d-a6cc-184b22110cc4</uuid>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <name>instance-00000006</name>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:56:39</nova:creationTime>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <entry name="serial">d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <entry name="uuid">d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:96:8b:66"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <target dev="tap3787f1e5-cc"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log" append="off"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:56:39 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:56:39 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:56:39 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:56:39 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.610 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Preparing to wait for external event network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.610 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.611 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.611 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.612 187010 DEBUG nova.virt.libvirt.vif [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:56:31Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.612 187010 DEBUG nova.network.os_vif_util [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.613 187010 DEBUG nova.network.os_vif_util [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:8b:66,bridge_name='br-int',has_traffic_filtering=True,id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5,network=Network(f6076e77-6bbe-422f-b8fa-650192fcd178),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3787f1e5-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.613 187010 DEBUG os_vif [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:8b:66,bridge_name='br-int',has_traffic_filtering=True,id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5,network=Network(f6076e77-6bbe-422f-b8fa-650192fcd178),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3787f1e5-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.613 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.614 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.614 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.617 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.618 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3787f1e5-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.618 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3787f1e5-cc, col_values=(('external_ids', {'iface-id': '3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:8b:66', 'vm-uuid': 'd1980092-5559-4d0d-a6cc-184b22110cc4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:39 np0005555140 NetworkManager[55531]: <info>  [1765446999.6207] manager: (tap3787f1e5-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.622 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.625 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.626 187010 INFO os_vif [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:8b:66,bridge_name='br-int',has_traffic_filtering=True,id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5,network=Network(f6076e77-6bbe-422f-b8fa-650192fcd178),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3787f1e5-cc')#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.683 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.684 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.684 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:96:8b:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.685 187010 INFO nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Using config drive#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.941 187010 INFO nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Creating config drive at /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config#033[00m
Dec 11 04:56:39 np0005555140 nova_compute[187006]: 2025-12-11 09:56:39.946 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7a997ua execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.067 187010 DEBUG oslo_concurrency.processutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7a997ua" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:40 np0005555140 kernel: tap3787f1e5-cc: entered promiscuous mode
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.137 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 NetworkManager[55531]: <info>  [1765447000.1380] manager: (tap3787f1e5-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec 11 04:56:40 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:40Z|00082|binding|INFO|Claiming lport 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 for this chassis.
Dec 11 04:56:40 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:40Z|00083|binding|INFO|3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5: Claiming fa:16:3e:96:8b:66 10.100.0.8
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.141 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.151 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:8b:66 10.100.0.8'], port_security=['fa:16:3e:96:8b:66 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6076e77-6bbe-422f-b8fa-650192fcd178', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f4b45b33-eea6-4cc0-a8ef-ccc6fc496694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e75b9db-d641-4613-b412-1310e816e31f, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.152 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 in datapath f6076e77-6bbe-422f-b8fa-650192fcd178 bound to our chassis#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.153 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6076e77-6bbe-422f-b8fa-650192fcd178#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.165 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[31161483-245d-41d4-94e9-ba847d14ed1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.166 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6076e77-61 in ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.167 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6076e77-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.168 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6bf74e-3bf9-4cd0-876d-8cf47c01d1b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.168 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[18ab222f-fd16-4405-a64c-2506acfe4aa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 systemd-udevd[216006]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.179 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[f7c8c056-6b04-44bd-911f-5430084a85a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 systemd-machined[153398]: New machine qemu-6-instance-00000006.
Dec 11 04:56:40 np0005555140 NetworkManager[55531]: <info>  [1765447000.1924] device (tap3787f1e5-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:56:40 np0005555140 NetworkManager[55531]: <info>  [1765447000.1936] device (tap3787f1e5-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.194 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Dec 11 04:56:40 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:40Z|00084|binding|INFO|Setting lport 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 ovn-installed in OVS
Dec 11 04:56:40 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:40Z|00085|binding|INFO|Setting lport 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 up in Southbound
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.200 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.204 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[43c0f22a-fd41-4761-8f84-a80803e8ffe3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 podman[215989]: 2025-12-11 09:56:40.216741548 +0000 UTC m=+0.078757483 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.236 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[570b865f-6deb-4f8a-a80b-1875135a502b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.243 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ddcc4056-5832-4a1b-adb2-3bca91ff4975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 NetworkManager[55531]: <info>  [1765447000.2445] manager: (tapf6076e77-60): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.273 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[7577e68b-d064-48e7-ba5d-41a5bd5f99d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.276 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[19599db9-02e4-4a56-b46b-2947fbd2986c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 NetworkManager[55531]: <info>  [1765447000.2973] device (tapf6076e77-60): carrier: link connected
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.300 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[52b07f5f-3b83-415b-937c-cdfe69823a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.317 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e6c466-a2e8-43af-9cb5-2a57ad75b508]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6076e77-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:a0:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 335886, 'reachable_time': 34470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216045, 'error': None, 'target': 'ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.330 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[520d6fe2-31ac-42b2-b3fc-153e27d87a16]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:a092'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 335886, 'tstamp': 335886}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216046, 'error': None, 'target': 'ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.351 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[40e3ceaa-61cd-4880-9202-c714859bd015]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6076e77-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:a0:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 335886, 'reachable_time': 34470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216047, 'error': None, 'target': 'ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.386 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d1ace9-1cb2-4358-8789-e19bcc23e4cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.445 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[fdbd33bf-f3fe-4a7a-9926-d5fe4ee25ae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.447 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6076e77-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.447 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.448 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6076e77-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.449 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 NetworkManager[55531]: <info>  [1765447000.4502] manager: (tapf6076e77-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Dec 11 04:56:40 np0005555140 kernel: tapf6076e77-60: entered promiscuous mode
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.452 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6076e77-60, col_values=(('external_ids', {'iface-id': 'be72e3a5-1e58-485d-bd82-34fa36f9e281'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.453 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:40Z|00086|binding|INFO|Releasing lport be72e3a5-1e58-485d-bd82-34fa36f9e281 from this chassis (sb_readonly=0)
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.454 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.455 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6076e77-6bbe-422f-b8fa-650192fcd178.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6076e77-6bbe-422f-b8fa-650192fcd178.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.456 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6308ea00-b098-436f-bfe1-6f20038e9b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.456 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-f6076e77-6bbe-422f-b8fa-650192fcd178
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/f6076e77-6bbe-422f-b8fa-650192fcd178.pid.haproxy
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID f6076e77-6bbe-422f-b8fa-650192fcd178
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:56:40 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:40.457 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178', 'env', 'PROCESS_TAG=haproxy-f6076e77-6bbe-422f-b8fa-650192fcd178', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6076e77-6bbe-422f-b8fa-650192fcd178.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.470 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.720 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447000.7196524, d1980092-5559-4d0d-a6cc-184b22110cc4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.721 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] VM Started (Lifecycle Event)#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.744 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.749 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447000.7205303, d1980092-5559-4d0d-a6cc-184b22110cc4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.749 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.770 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.772 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.794 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:56:40 np0005555140 podman[216086]: 2025-12-11 09:56:40.834043802 +0000 UTC m=+0.050893906 container create 7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.860 187010 DEBUG nova.compute.manager [req-1784fb4a-813a-4858-826f-58acddf56626 req-9f13982e-ed5e-4c73-9f0b-5aa55ddda81c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.861 187010 DEBUG oslo_concurrency.lockutils [req-1784fb4a-813a-4858-826f-58acddf56626 req-9f13982e-ed5e-4c73-9f0b-5aa55ddda81c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.861 187010 DEBUG oslo_concurrency.lockutils [req-1784fb4a-813a-4858-826f-58acddf56626 req-9f13982e-ed5e-4c73-9f0b-5aa55ddda81c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.862 187010 DEBUG oslo_concurrency.lockutils [req-1784fb4a-813a-4858-826f-58acddf56626 req-9f13982e-ed5e-4c73-9f0b-5aa55ddda81c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.862 187010 DEBUG nova.compute.manager [req-1784fb4a-813a-4858-826f-58acddf56626 req-9f13982e-ed5e-4c73-9f0b-5aa55ddda81c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Processing event network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.863 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.870 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.870 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447000.8705413, d1980092-5559-4d0d-a6cc-184b22110cc4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.871 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.875 187010 INFO nova.virt.libvirt.driver [-] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Instance spawned successfully.#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.875 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:56:40 np0005555140 systemd[1]: Started libpod-conmon-7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02.scope.
Dec 11 04:56:40 np0005555140 podman[216086]: 2025-12-11 09:56:40.802729817 +0000 UTC m=+0.019579941 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.903 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:56:40 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:56:40 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b004498cd56dcdff757b472a1f71b35de974e9e5cd5304a7ab813bc40ac459f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.910 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.914 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.914 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.915 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.915 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.916 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.916 187010 DEBUG nova.virt.libvirt.driver [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:56:40 np0005555140 podman[216086]: 2025-12-11 09:56:40.921243786 +0000 UTC m=+0.138093920 container init 7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 04:56:40 np0005555140 podman[216086]: 2025-12-11 09:56:40.926575209 +0000 UTC m=+0.143425313 container start 7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 04:56:40 np0005555140 neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178[216101]: [NOTICE]   (216105) : New worker (216107) forked
Dec 11 04:56:40 np0005555140 neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178[216101]: [NOTICE]   (216105) : Loading success.
Dec 11 04:56:40 np0005555140 nova_compute[187006]: 2025-12-11 09:56:40.959 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.001 187010 INFO nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Took 8.98 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.002 187010 DEBUG nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.076 187010 INFO nova.compute.manager [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Took 10.00 seconds to build instance.#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.092 187010 DEBUG oslo_concurrency.lockutils [None req-4a384b90-5286-48af-9870-1dfba4ebbeb6 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.112 187010 DEBUG nova.network.neutron [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updated VIF entry in instance network info cache for port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.113 187010 DEBUG nova.network.neutron [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.128 187010 DEBUG oslo_concurrency.lockutils [req-a254b840-97fa-4eff-8896-f3c89014b0bc req-2e91bfcb-e231-47c6-a579-e88321d47865 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:56:41 np0005555140 nova_compute[187006]: 2025-12-11 09:56:41.762 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:42 np0005555140 nova_compute[187006]: 2025-12-11 09:56:42.956 187010 DEBUG nova.compute.manager [req-0ec7023b-11a2-470e-ab55-1661c2454e6a req-e40a9a06-6fc3-4cd3-9d09-fcbe8dc9786c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:42 np0005555140 nova_compute[187006]: 2025-12-11 09:56:42.959 187010 DEBUG oslo_concurrency.lockutils [req-0ec7023b-11a2-470e-ab55-1661c2454e6a req-e40a9a06-6fc3-4cd3-9d09-fcbe8dc9786c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:42 np0005555140 nova_compute[187006]: 2025-12-11 09:56:42.959 187010 DEBUG oslo_concurrency.lockutils [req-0ec7023b-11a2-470e-ab55-1661c2454e6a req-e40a9a06-6fc3-4cd3-9d09-fcbe8dc9786c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:42 np0005555140 nova_compute[187006]: 2025-12-11 09:56:42.959 187010 DEBUG oslo_concurrency.lockutils [req-0ec7023b-11a2-470e-ab55-1661c2454e6a req-e40a9a06-6fc3-4cd3-9d09-fcbe8dc9786c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:42 np0005555140 nova_compute[187006]: 2025-12-11 09:56:42.960 187010 DEBUG nova.compute.manager [req-0ec7023b-11a2-470e-ab55-1661c2454e6a req-e40a9a06-6fc3-4cd3-9d09-fcbe8dc9786c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] No waiting events found dispatching network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:56:42 np0005555140 nova_compute[187006]: 2025-12-11 09:56:42.960 187010 WARNING nova.compute.manager [req-0ec7023b-11a2-470e-ab55-1661c2454e6a req-e40a9a06-6fc3-4cd3-9d09-fcbe8dc9786c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received unexpected event network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:56:44 np0005555140 nova_compute[187006]: 2025-12-11 09:56:44.620 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:44 np0005555140 podman[216116]: 2025-12-11 09:56:44.700352952 +0000 UTC m=+0.071862736 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:56:46 np0005555140 nova_compute[187006]: 2025-12-11 09:56:46.765 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:46 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:46Z|00087|binding|INFO|Releasing lport be72e3a5-1e58-485d-bd82-34fa36f9e281 from this chassis (sb_readonly=0)
Dec 11 04:56:46 np0005555140 nova_compute[187006]: 2025-12-11 09:56:46.786 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:46 np0005555140 NetworkManager[55531]: <info>  [1765447006.7920] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Dec 11 04:56:46 np0005555140 NetworkManager[55531]: <info>  [1765447006.7930] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Dec 11 04:56:46 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:46Z|00088|binding|INFO|Releasing lport be72e3a5-1e58-485d-bd82-34fa36f9e281 from this chassis (sb_readonly=0)
Dec 11 04:56:46 np0005555140 nova_compute[187006]: 2025-12-11 09:56:46.820 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:46 np0005555140 nova_compute[187006]: 2025-12-11 09:56:46.824 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:47 np0005555140 nova_compute[187006]: 2025-12-11 09:56:47.124 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:47 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:47.125 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:56:47 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:47.127 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:56:47 np0005555140 nova_compute[187006]: 2025-12-11 09:56:47.199 187010 DEBUG nova.compute.manager [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-changed-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:56:47 np0005555140 nova_compute[187006]: 2025-12-11 09:56:47.199 187010 DEBUG nova.compute.manager [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing instance network info cache due to event network-changed-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:56:47 np0005555140 nova_compute[187006]: 2025-12-11 09:56:47.199 187010 DEBUG oslo_concurrency.lockutils [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:56:47 np0005555140 nova_compute[187006]: 2025-12-11 09:56:47.200 187010 DEBUG oslo_concurrency.lockutils [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:56:47 np0005555140 nova_compute[187006]: 2025-12-11 09:56:47.200 187010 DEBUG nova.network.neutron [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing network info cache for port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:56:47 np0005555140 podman[216141]: 2025-12-11 09:56:47.701659994 +0000 UTC m=+0.065106703 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm)
Dec 11 04:56:47 np0005555140 podman[216140]: 2025-12-11 09:56:47.745520289 +0000 UTC m=+0.104816599 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 04:56:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:48.129 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:56:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:48.622 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:48.623 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:56:48.623 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:48 np0005555140 nova_compute[187006]: 2025-12-11 09:56:48.971 187010 DEBUG nova.network.neutron [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updated VIF entry in instance network info cache for port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:56:48 np0005555140 nova_compute[187006]: 2025-12-11 09:56:48.972 187010 DEBUG nova.network.neutron [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:56:48 np0005555140 nova_compute[187006]: 2025-12-11 09:56:48.989 187010 DEBUG oslo_concurrency.lockutils [req-08fa38fa-4571-4cb6-b248-be375d5b2960 req-8d7854b4-b79a-4c4c-b401-44b31af4df1c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:56:49 np0005555140 nova_compute[187006]: 2025-12-11 09:56:49.622 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:51 np0005555140 nova_compute[187006]: 2025-12-11 09:56:51.767 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:52 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:52Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:96:8b:66 10.100.0.8
Dec 11 04:56:52 np0005555140 ovn_controller[95438]: 2025-12-11T09:56:52Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:8b:66 10.100.0.8
Dec 11 04:56:52 np0005555140 nova_compute[187006]: 2025-12-11 09:56:52.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:52 np0005555140 nova_compute[187006]: 2025-12-11 09:56:52.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:53 np0005555140 nova_compute[187006]: 2025-12-11 09:56:53.840 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:53 np0005555140 nova_compute[187006]: 2025-12-11 09:56:53.863 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:53 np0005555140 nova_compute[187006]: 2025-12-11 09:56:53.863 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.624 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.840 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.840 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.841 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.865 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.865 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.865 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.866 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 04:56:54 np0005555140 nova_compute[187006]: 2025-12-11 09:56:54.882 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 04:56:55 np0005555140 nova_compute[187006]: 2025-12-11 09:56:55.845 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:55 np0005555140 nova_compute[187006]: 2025-12-11 09:56:55.845 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:56 np0005555140 nova_compute[187006]: 2025-12-11 09:56:56.769 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:56 np0005555140 nova_compute[187006]: 2025-12-11 09:56:56.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:56 np0005555140 nova_compute[187006]: 2025-12-11 09:56:56.854 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:56 np0005555140 nova_compute[187006]: 2025-12-11 09:56:56.855 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:56 np0005555140 nova_compute[187006]: 2025-12-11 09:56:56.855 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:56 np0005555140 nova_compute[187006]: 2025-12-11 09:56:56.856 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:56:56 np0005555140 nova_compute[187006]: 2025-12-11 09:56:56.928 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:56 np0005555140 podman[216200]: 2025-12-11 09:56:56.990934772 +0000 UTC m=+0.080971977 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.011 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.013 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.070 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.213 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.214 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5535MB free_disk=73.29983139038086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.214 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.215 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.355 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance d1980092-5559-4d0d-a6cc-184b22110cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.355 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.355 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.517 187010 INFO nova.compute.manager [None req-e94b4848-3149-4e00-80bc-7eadd5fe9037 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Get console output#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.520 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.526 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.533 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.552 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:56:57 np0005555140 nova_compute[187006]: 2025-12-11 09:56:57.552 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:56:58 np0005555140 nova_compute[187006]: 2025-12-11 09:56:58.553 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:56:58 np0005555140 nova_compute[187006]: 2025-12-11 09:56:58.554 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:56:59 np0005555140 nova_compute[187006]: 2025-12-11 09:56:59.626 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:56:59 np0005555140 nova_compute[187006]: 2025-12-11 09:56:59.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:57:01 np0005555140 nova_compute[187006]: 2025-12-11 09:57:01.773 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:04 np0005555140 nova_compute[187006]: 2025-12-11 09:57:04.630 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:05 np0005555140 podman[216231]: 2025-12-11 09:57:05.680683162 +0000 UTC m=+0.059427530 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 04:57:05 np0005555140 podman[216232]: 2025-12-11 09:57:05.689161135 +0000 UTC m=+0.061764678 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 04:57:05 np0005555140 nova_compute[187006]: 2025-12-11 09:57:05.854 187010 DEBUG oslo_concurrency.lockutils [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "interface-d1980092-5559-4d0d-a6cc-184b22110cc4-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:05 np0005555140 nova_compute[187006]: 2025-12-11 09:57:05.855 187010 DEBUG oslo_concurrency.lockutils [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-d1980092-5559-4d0d-a6cc-184b22110cc4-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:05 np0005555140 nova_compute[187006]: 2025-12-11 09:57:05.856 187010 DEBUG nova.objects.instance [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'flavor' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:57:06 np0005555140 nova_compute[187006]: 2025-12-11 09:57:06.664 187010 DEBUG nova.objects.instance [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_requests' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:57:06 np0005555140 nova_compute[187006]: 2025-12-11 09:57:06.683 187010 DEBUG nova.network.neutron [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:57:06 np0005555140 nova_compute[187006]: 2025-12-11 09:57:06.775 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:06 np0005555140 nova_compute[187006]: 2025-12-11 09:57:06.850 187010 DEBUG nova.policy [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:57:07 np0005555140 nova_compute[187006]: 2025-12-11 09:57:07.859 187010 DEBUG nova.network.neutron [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Successfully created port: 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:57:08 np0005555140 nova_compute[187006]: 2025-12-11 09:57:08.995 187010 DEBUG nova.network.neutron [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Successfully updated port: 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:57:09 np0005555140 nova_compute[187006]: 2025-12-11 09:57:09.018 187010 DEBUG oslo_concurrency.lockutils [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:57:09 np0005555140 nova_compute[187006]: 2025-12-11 09:57:09.018 187010 DEBUG oslo_concurrency.lockutils [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:57:09 np0005555140 nova_compute[187006]: 2025-12-11 09:57:09.018 187010 DEBUG nova.network.neutron [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:57:09 np0005555140 nova_compute[187006]: 2025-12-11 09:57:09.120 187010 DEBUG nova.compute.manager [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-changed-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:57:09 np0005555140 nova_compute[187006]: 2025-12-11 09:57:09.120 187010 DEBUG nova.compute.manager [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing instance network info cache due to event network-changed-435350c0-ea0e-4c79-8bc4-70f3e12c60c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:57:09 np0005555140 nova_compute[187006]: 2025-12-11 09:57:09.120 187010 DEBUG oslo_concurrency.lockutils [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:57:09 np0005555140 nova_compute[187006]: 2025-12-11 09:57:09.632 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:10 np0005555140 podman[216271]: 2025-12-11 09:57:10.689839135 +0000 UTC m=+0.061459966 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 11 04:57:11 np0005555140 nova_compute[187006]: 2025-12-11 09:57:11.729 187010 DEBUG nova.network.neutron [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:57:11 np0005555140 nova_compute[187006]: 2025-12-11 09:57:11.776 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.229 187010 DEBUG oslo_concurrency.lockutils [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.230 187010 DEBUG oslo_concurrency.lockutils [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.230 187010 DEBUG nova.network.neutron [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing network info cache for port 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.232 187010 DEBUG nova.virt.libvirt.vif [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:56:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:56:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.233 187010 DEBUG nova.network.os_vif_util [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.233 187010 DEBUG nova.network.os_vif_util [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.234 187010 DEBUG os_vif [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.234 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.234 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.235 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.237 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.237 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap435350c0-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.237 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap435350c0-ea, col_values=(('external_ids', {'iface-id': '435350c0-ea0e-4c79-8bc4-70f3e12c60c2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ca:5b:f8', 'vm-uuid': 'd1980092-5559-4d0d-a6cc-184b22110cc4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.239 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 NetworkManager[55531]: <info>  [1765447032.2400] manager: (tap435350c0-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.240 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.249 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.250 187010 INFO os_vif [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea')#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.251 187010 DEBUG nova.virt.libvirt.vif [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:56:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:56:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.251 187010 DEBUG nova.network.os_vif_util [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.252 187010 DEBUG nova.network.os_vif_util [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.257 187010 DEBUG nova.virt.libvirt.guest [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] attach device xml: <interface type="ethernet">
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <mac address="fa:16:3e:ca:5b:f8"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <model type="virtio"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <mtu size="1442"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <target dev="tap435350c0-ea"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]: </interface>
Dec 11 04:57:12 np0005555140 nova_compute[187006]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec 11 04:57:12 np0005555140 kernel: tap435350c0-ea: entered promiscuous mode
Dec 11 04:57:12 np0005555140 NetworkManager[55531]: <info>  [1765447032.2697] manager: (tap435350c0-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Dec 11 04:57:12 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:12Z|00089|binding|INFO|Claiming lport 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 for this chassis.
Dec 11 04:57:12 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:12Z|00090|binding|INFO|435350c0-ea0e-4c79-8bc4-70f3e12c60c2: Claiming fa:16:3e:ca:5b:f8 10.100.0.25
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.270 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.278 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:5b:f8 10.100.0.25'], port_security=['fa:16:3e:ca:5b:f8 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=269634f1-de2c-4f8c-bf63-bd8d66b32201, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=435350c0-ea0e-4c79-8bc4-70f3e12c60c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.279 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 in datapath ac1787f0-46c0-4ede-92c6-bec41e2a3f91 bound to our chassis#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.280 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac1787f0-46c0-4ede-92c6-bec41e2a3f91#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.293 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf36d6d-bc77-4dde-b8fc-a3fa61435d90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.294 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapac1787f0-41 in ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.296 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapac1787f0-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.296 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f889434f-f26b-4e18-8d35-c2e3020ce1ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.297 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[40f4b222-71d1-476c-810b-2bec38f2e0e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.304 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:12Z|00091|binding|INFO|Setting lport 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 ovn-installed in OVS
Dec 11 04:57:12 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:12Z|00092|binding|INFO|Setting lport 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 up in Southbound
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.308 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 systemd-udevd[216299]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.310 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[79694094-2517-4c20-86ad-0938e4e4888b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 NetworkManager[55531]: <info>  [1765447032.3200] device (tap435350c0-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:57:12 np0005555140 NetworkManager[55531]: <info>  [1765447032.3212] device (tap435350c0-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.332 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[bc94eb9f-c251-4112-83c8-e8129bf22742]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.363 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[f62b5520-1dc3-4aac-842c-8520c8bbf414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.369 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[adba41fc-d60b-4b5b-97cf-d121aa4d713d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 NetworkManager[55531]: <info>  [1765447032.3703] manager: (tapac1787f0-40): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.396 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b5f593-6143-4110-84ce-1128de3ebb5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.399 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[acb01cb2-dc80-4256-826f-6e6156eeb0a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 NetworkManager[55531]: <info>  [1765447032.4171] device (tapac1787f0-40): carrier: link connected
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.422 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[62f9c13f-3d7c-4772-99af-bcafd15154db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.436 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[c5864d59-d65a-4aaa-81be-43b35ba4d765]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac1787f0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:5b:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339098, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216326, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.450 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[41e8955c-d369-44d2-aaf1-4c1d42f66f94]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:5b56'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339098, 'tstamp': 339098}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216327, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.466 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb1b359-5fdf-4642-85f9-bb678747687d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac1787f0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:5b:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339098, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216328, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.481 187010 DEBUG nova.virt.libvirt.driver [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.482 187010 DEBUG nova.virt.libvirt.driver [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.482 187010 DEBUG nova.virt.libvirt.driver [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:96:8b:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.483 187010 DEBUG nova.virt.libvirt.driver [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:ca:5b:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.491 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a927c8-c283-4339-a1f9-8e3e02a41c3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.509 187010 DEBUG nova.virt.libvirt.guest [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:57:12</nova:creationTime>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:57:12 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    <nova:port uuid="435350c0-ea0e-4c79-8bc4-70f3e12c60c2">
Dec 11 04:57:12 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:57:12 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:57:12 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:57:12 np0005555140 nova_compute[187006]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.533 187010 DEBUG oslo_concurrency.lockutils [None req-262dea81-5561-4a00-a699-9f2e101c1a4b 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-d1980092-5559-4d0d-a6cc-184b22110cc4-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.544 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[5607397a-f438-4bf9-81a4-f4b263877b33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.545 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac1787f0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.545 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.546 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac1787f0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.547 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 kernel: tapac1787f0-40: entered promiscuous mode
Dec 11 04:57:12 np0005555140 NetworkManager[55531]: <info>  [1765447032.5495] manager: (tapac1787f0-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.549 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.550 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac1787f0-40, col_values=(('external_ids', {'iface-id': '2448c0c3-3800-4261-a3ca-d11be99e9c3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.551 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:12Z|00093|binding|INFO|Releasing lport 2448c0c3-3800-4261-a3ca-d11be99e9c3d from this chassis (sb_readonly=0)
Dec 11 04:57:12 np0005555140 nova_compute[187006]: 2025-12-11 09:57:12.562 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.563 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac1787f0-46c0-4ede-92c6-bec41e2a3f91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac1787f0-46c0-4ede-92c6-bec41e2a3f91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.564 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[91527cc0-8aed-41ca-8bdc-090ffbfaaee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.564 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-ac1787f0-46c0-4ede-92c6-bec41e2a3f91
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/ac1787f0-46c0-4ede-92c6-bec41e2a3f91.pid.haproxy
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID ac1787f0-46c0-4ede-92c6-bec41e2a3f91
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:57:12 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:12.565 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'env', 'PROCESS_TAG=haproxy-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ac1787f0-46c0-4ede-92c6-bec41e2a3f91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:57:12 np0005555140 podman[216360]: 2025-12-11 09:57:12.963309107 +0000 UTC m=+0.073000226 container create 1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 11 04:57:12 np0005555140 systemd[1]: Started libpod-conmon-1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1.scope.
Dec 11 04:57:13 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:57:13 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29b849e4770ca2c6b6e52bd7128140826e288c627769f81f53ba7c5161a0babc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:57:13 np0005555140 podman[216360]: 2025-12-11 09:57:12.932229489 +0000 UTC m=+0.041920588 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:57:13 np0005555140 podman[216360]: 2025-12-11 09:57:13.028754596 +0000 UTC m=+0.138445685 container init 1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 04:57:13 np0005555140 podman[216360]: 2025-12-11 09:57:13.040240324 +0000 UTC m=+0.149931393 container start 1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 11 04:57:13 np0005555140 neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91[216375]: [NOTICE]   (216379) : New worker (216381) forked
Dec 11 04:57:13 np0005555140 neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91[216375]: [NOTICE]   (216379) : Loading success.
Dec 11 04:57:13 np0005555140 nova_compute[187006]: 2025-12-11 09:57:13.188 187010 DEBUG nova.compute.manager [req-a1408c29-6e0b-4ebc-88b4-80e03eb3461d req-5df10f0d-8d55-4b63-87f5-9022c17f0ca2 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:57:13 np0005555140 nova_compute[187006]: 2025-12-11 09:57:13.189 187010 DEBUG oslo_concurrency.lockutils [req-a1408c29-6e0b-4ebc-88b4-80e03eb3461d req-5df10f0d-8d55-4b63-87f5-9022c17f0ca2 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:13 np0005555140 nova_compute[187006]: 2025-12-11 09:57:13.189 187010 DEBUG oslo_concurrency.lockutils [req-a1408c29-6e0b-4ebc-88b4-80e03eb3461d req-5df10f0d-8d55-4b63-87f5-9022c17f0ca2 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:13 np0005555140 nova_compute[187006]: 2025-12-11 09:57:13.189 187010 DEBUG oslo_concurrency.lockutils [req-a1408c29-6e0b-4ebc-88b4-80e03eb3461d req-5df10f0d-8d55-4b63-87f5-9022c17f0ca2 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:13 np0005555140 nova_compute[187006]: 2025-12-11 09:57:13.189 187010 DEBUG nova.compute.manager [req-a1408c29-6e0b-4ebc-88b4-80e03eb3461d req-5df10f0d-8d55-4b63-87f5-9022c17f0ca2 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] No waiting events found dispatching network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:57:13 np0005555140 nova_compute[187006]: 2025-12-11 09:57:13.189 187010 WARNING nova.compute.manager [req-a1408c29-6e0b-4ebc-88b4-80e03eb3461d req-5df10f0d-8d55-4b63-87f5-9022c17f0ca2 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received unexpected event network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:57:13 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:13Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ca:5b:f8 10.100.0.25
Dec 11 04:57:13 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:13Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ca:5b:f8 10.100.0.25
Dec 11 04:57:14 np0005555140 nova_compute[187006]: 2025-12-11 09:57:14.110 187010 DEBUG nova.network.neutron [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updated VIF entry in instance network info cache for port 435350c0-ea0e-4c79-8bc4-70f3e12c60c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:57:14 np0005555140 nova_compute[187006]: 2025-12-11 09:57:14.111 187010 DEBUG nova.network.neutron [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:57:14 np0005555140 nova_compute[187006]: 2025-12-11 09:57:14.130 187010 DEBUG oslo_concurrency.lockutils [req-7d9a7762-39f2-4cad-a985-68f1d0cacf5d req-62884e40-6595-43ac-aa19-31e7ec296aff b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.277 187010 DEBUG nova.compute.manager [req-5fc84cc6-9814-4d31-9ec8-ccbf37d31b64 req-8df79b6f-3a08-4e37-b439-d822bf0326f3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.277 187010 DEBUG oslo_concurrency.lockutils [req-5fc84cc6-9814-4d31-9ec8-ccbf37d31b64 req-8df79b6f-3a08-4e37-b439-d822bf0326f3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.277 187010 DEBUG oslo_concurrency.lockutils [req-5fc84cc6-9814-4d31-9ec8-ccbf37d31b64 req-8df79b6f-3a08-4e37-b439-d822bf0326f3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.278 187010 DEBUG oslo_concurrency.lockutils [req-5fc84cc6-9814-4d31-9ec8-ccbf37d31b64 req-8df79b6f-3a08-4e37-b439-d822bf0326f3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.278 187010 DEBUG nova.compute.manager [req-5fc84cc6-9814-4d31-9ec8-ccbf37d31b64 req-8df79b6f-3a08-4e37-b439-d822bf0326f3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] No waiting events found dispatching network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.278 187010 WARNING nova.compute.manager [req-5fc84cc6-9814-4d31-9ec8-ccbf37d31b64 req-8df79b6f-3a08-4e37-b439-d822bf0326f3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received unexpected event network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:57:15 np0005555140 podman[216390]: 2025-12-11 09:57:15.706179646 +0000 UTC m=+0.068756365 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.740 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.763 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Triggering sync for uuid d1980092-5559-4d0d-a6cc-184b22110cc4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.764 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.765 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:15 np0005555140 nova_compute[187006]: 2025-12-11 09:57:15.820 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:16 np0005555140 nova_compute[187006]: 2025-12-11 09:57:16.778 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:17 np0005555140 nova_compute[187006]: 2025-12-11 09:57:17.239 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:18 np0005555140 podman[216414]: 2025-12-11 09:57:18.685677035 +0000 UTC m=+0.056445323 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc.)
Dec 11 04:57:18 np0005555140 podman[216413]: 2025-12-11 09:57:18.7030104 +0000 UTC m=+0.076635080 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.264 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.264 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.287 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.373 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.374 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.384 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.385 187010 INFO nova.compute.claims [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.493 187010 DEBUG nova.compute.provider_tree [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.513 187010 DEBUG nova.scheduler.client.report [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.532 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.533 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.576 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.576 187010 DEBUG nova.network.neutron [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.597 187010 INFO nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.615 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.693 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.695 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.696 187010 INFO nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Creating image(s)
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.697 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.698 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.699 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.727 187010 DEBUG nova.policy [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.731 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.780 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.797 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.798 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.799 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.812 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.867 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.868 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.899 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.900 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.900 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.953 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.954 187010 DEBUG nova.virt.disk.api [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 04:57:21 np0005555140 nova_compute[187006]: 2025-12-11 09:57:21.955 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.010 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.011 187010 DEBUG nova.virt.disk.api [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.011 187010 DEBUG nova.objects.instance [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 72b08ac6-c357-4b0b-8c3f-05df9e439a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.024 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.025 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Ensure instance console log exists: /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.025 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.026 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.026 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:57:22 np0005555140 nova_compute[187006]: 2025-12-11 09:57:22.241 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:25 np0005555140 nova_compute[187006]: 2025-12-11 09:57:25.767 187010 DEBUG nova.network.neutron [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Successfully created port: fc691b86-f650-4fef-ba47-39bbfc1eeff8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.785 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.792 187010 DEBUG nova.network.neutron [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Successfully updated port: fc691b86-f650-4fef-ba47-39bbfc1eeff8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.805 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-72b08ac6-c357-4b0b-8c3f-05df9e439a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.805 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-72b08ac6-c357-4b0b-8c3f-05df9e439a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.806 187010 DEBUG nova.network.neutron [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.880 187010 DEBUG nova.compute.manager [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-changed-fc691b86-f650-4fef-ba47-39bbfc1eeff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.881 187010 DEBUG nova.compute.manager [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Refreshing instance network info cache due to event network-changed-fc691b86-f650-4fef-ba47-39bbfc1eeff8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.881 187010 DEBUG oslo_concurrency.lockutils [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-72b08ac6-c357-4b0b-8c3f-05df9e439a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 04:57:26 np0005555140 nova_compute[187006]: 2025-12-11 09:57:26.941 187010 DEBUG nova.network.neutron [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 11 04:57:27 np0005555140 nova_compute[187006]: 2025-12-11 09:57:27.243 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:27 np0005555140 podman[216472]: 2025-12-11 09:57:27.680841555 +0000 UTC m=+0.056869295 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 04:57:27 np0005555140 nova_compute[187006]: 2025-12-11 09:57:27.986 187010 DEBUG nova.network.neutron [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Updating instance_info_cache with network_info: [{"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.016 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-72b08ac6-c357-4b0b-8c3f-05df9e439a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.017 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Instance network_info: |[{"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.017 187010 DEBUG oslo_concurrency.lockutils [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-72b08ac6-c357-4b0b-8c3f-05df9e439a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.018 187010 DEBUG nova.network.neutron [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Refreshing network info cache for port fc691b86-f650-4fef-ba47-39bbfc1eeff8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.020 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Start _get_guest_xml network_info=[{"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.024 187010 WARNING nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.028 187010 DEBUG nova.virt.libvirt.host [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.029 187010 DEBUG nova.virt.libvirt.host [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.037 187010 DEBUG nova.virt.libvirt.host [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.038 187010 DEBUG nova.virt.libvirt.host [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.038 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.039 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.039 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.039 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.039 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.040 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.040 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.040 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.040 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.041 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.041 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.041 187010 DEBUG nova.virt.hardware [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.046 187010 DEBUG nova.virt.libvirt.vif [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:57:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1623364696',display_name='tempest-TestNetworkBasicOps-server-1623364696',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1623364696',id=7,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIeHb0/t1D0SUyGLgaiIPGcB+qu02xGrxqrZw0NDgzlKndSZC7ZFLEhrd1LjC+GpQofYxUoA8F20+3zQUnXezxZH+vFzjqrOwyek358sCEFvhoxJQrpgUML5K8Ufsdpu2Q==',key_name='tempest-TestNetworkBasicOps-833561704',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-0um5fobf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:57:21Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=72b08ac6-c357-4b0b-8c3f-05df9e439a54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.046 187010 DEBUG nova.network.os_vif_util [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.047 187010 DEBUG nova.network.os_vif_util [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:2a:6e,bridge_name='br-int',has_traffic_filtering=True,id=fc691b86-f650-4fef-ba47-39bbfc1eeff8,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc691b86-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.048 187010 DEBUG nova.objects.instance [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 72b08ac6-c357-4b0b-8c3f-05df9e439a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.061 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <uuid>72b08ac6-c357-4b0b-8c3f-05df9e439a54</uuid>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <name>instance-00000007</name>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1623364696</nova:name>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:57:28</nova:creationTime>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        <nova:port uuid="fc691b86-f650-4fef-ba47-39bbfc1eeff8">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <entry name="serial">72b08ac6-c357-4b0b-8c3f-05df9e439a54</entry>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <entry name="uuid">72b08ac6-c357-4b0b-8c3f-05df9e439a54</entry>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.config"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:82:2a:6e"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <target dev="tapfc691b86-f6"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/console.log" append="off"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:57:28 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:57:28 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:57:28 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:57:28 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.062 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Preparing to wait for external event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.063 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.063 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.063 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.064 187010 DEBUG nova.virt.libvirt.vif [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:57:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1623364696',display_name='tempest-TestNetworkBasicOps-server-1623364696',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1623364696',id=7,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIeHb0/t1D0SUyGLgaiIPGcB+qu02xGrxqrZw0NDgzlKndSZC7ZFLEhrd1LjC+GpQofYxUoA8F20+3zQUnXezxZH+vFzjqrOwyek358sCEFvhoxJQrpgUML5K8Ufsdpu2Q==',key_name='tempest-TestNetworkBasicOps-833561704',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-0um5fobf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:57:21Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=72b08ac6-c357-4b0b-8c3f-05df9e439a54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.064 187010 DEBUG nova.network.os_vif_util [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.065 187010 DEBUG nova.network.os_vif_util [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:2a:6e,bridge_name='br-int',has_traffic_filtering=True,id=fc691b86-f650-4fef-ba47-39bbfc1eeff8,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc691b86-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.065 187010 DEBUG os_vif [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:2a:6e,bridge_name='br-int',has_traffic_filtering=True,id=fc691b86-f650-4fef-ba47-39bbfc1eeff8,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc691b86-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.066 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.066 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.066 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.068 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.069 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc691b86-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.069 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc691b86-f6, col_values=(('external_ids', {'iface-id': 'fc691b86-f650-4fef-ba47-39bbfc1eeff8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:2a:6e', 'vm-uuid': '72b08ac6-c357-4b0b-8c3f-05df9e439a54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.070 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:28 np0005555140 NetworkManager[55531]: <info>  [1765447048.0714] manager: (tapfc691b86-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.072 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.081 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.082 187010 INFO os_vif [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:2a:6e,bridge_name='br-int',has_traffic_filtering=True,id=fc691b86-f650-4fef-ba47-39bbfc1eeff8,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc691b86-f6')#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.136 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.137 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.137 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:82:2a:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.138 187010 INFO nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Using config drive#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.835 187010 INFO nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Creating config drive at /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.config#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.841 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpntx5lkkq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:57:28 np0005555140 nova_compute[187006]: 2025-12-11 09:57:28.968 187010 DEBUG oslo_concurrency.processutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpntx5lkkq" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:57:29 np0005555140 NetworkManager[55531]: <info>  [1765447049.0197] manager: (tapfc691b86-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Dec 11 04:57:29 np0005555140 kernel: tapfc691b86-f6: entered promiscuous mode
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.021 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:29 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:29Z|00094|binding|INFO|Claiming lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 for this chassis.
Dec 11 04:57:29 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:29Z|00095|binding|INFO|fc691b86-f650-4fef-ba47-39bbfc1eeff8: Claiming fa:16:3e:82:2a:6e 10.100.0.22
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.031 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:2a:6e 10.100.0.22'], port_security=['fa:16:3e:82:2a:6e 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4c35273a-1805-4786-9147-92f6f3e7516e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=269634f1-de2c-4f8c-bf63-bd8d66b32201, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=fc691b86-f650-4fef-ba47-39bbfc1eeff8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.033 104288 INFO neutron.agent.ovn.metadata.agent [-] Port fc691b86-f650-4fef-ba47-39bbfc1eeff8 in datapath ac1787f0-46c0-4ede-92c6-bec41e2a3f91 bound to our chassis#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.034 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac1787f0-46c0-4ede-92c6-bec41e2a3f91#033[00m
Dec 11 04:57:29 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:29Z|00096|binding|INFO|Setting lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 ovn-installed in OVS
Dec 11 04:57:29 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:29Z|00097|binding|INFO|Setting lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 up in Southbound
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.040 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.043 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:29 np0005555140 systemd-udevd[216517]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.051 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[75092f95-f09a-43ea-876e-8cbd925feaba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:29 np0005555140 systemd-machined[153398]: New machine qemu-7-instance-00000007.
Dec 11 04:57:29 np0005555140 NetworkManager[55531]: <info>  [1765447049.0646] device (tapfc691b86-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:57:29 np0005555140 NetworkManager[55531]: <info>  [1765447049.0652] device (tapfc691b86-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:57:29 np0005555140 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.082 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2c969b-b883-481e-8f79-5a1d9b7acef5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.085 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[212d5568-d719-4eaa-91f7-00219624ee90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.116 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[84cc4e0d-daf0-4b9e-9fe5-999b8344f703]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.132 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[1729e42d-3cd0-451a-bae8-58146a4ce009]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac1787f0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:5b:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339098, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216529, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.145 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[07897aec-e7a1-488f-9ea4-5fe4a4bc84ed]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339108, 'tstamp': 339108}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216531, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339110, 'tstamp': 339110}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216531, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.147 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac1787f0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.149 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.150 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac1787f0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.150 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.151 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac1787f0-40, col_values=(('external_ids', {'iface-id': '2448c0c3-3800-4261-a3ca-d11be99e9c3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:29 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:29.151 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.199 187010 DEBUG nova.compute.manager [req-976480cc-9d3e-4b91-9e17-250be3239453 req-5568f29a-3c38-43f7-b90a-5fdad82e6bd7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.200 187010 DEBUG oslo_concurrency.lockutils [req-976480cc-9d3e-4b91-9e17-250be3239453 req-5568f29a-3c38-43f7-b90a-5fdad82e6bd7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.200 187010 DEBUG oslo_concurrency.lockutils [req-976480cc-9d3e-4b91-9e17-250be3239453 req-5568f29a-3c38-43f7-b90a-5fdad82e6bd7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.200 187010 DEBUG oslo_concurrency.lockutils [req-976480cc-9d3e-4b91-9e17-250be3239453 req-5568f29a-3c38-43f7-b90a-5fdad82e6bd7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.201 187010 DEBUG nova.compute.manager [req-976480cc-9d3e-4b91-9e17-250be3239453 req-5568f29a-3c38-43f7-b90a-5fdad82e6bd7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Processing event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.386 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447049.3860664, 72b08ac6-c357-4b0b-8c3f-05df9e439a54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.387 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] VM Started (Lifecycle Event)#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.390 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.394 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.397 187010 INFO nova.virt.libvirt.driver [-] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Instance spawned successfully.#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.397 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.412 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.418 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.423 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.423 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.424 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.425 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.425 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.426 187010 DEBUG nova.virt.libvirt.driver [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.433 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.434 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447049.3863094, 72b08ac6-c357-4b0b-8c3f-05df9e439a54 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.434 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.474 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.478 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447049.3928425, 72b08ac6-c357-4b0b-8c3f-05df9e439a54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.479 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.496 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.499 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.502 187010 INFO nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Took 7.81 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.502 187010 DEBUG nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.530 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.559 187010 INFO nova.compute.manager [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Took 8.22 seconds to build instance.#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.575 187010 DEBUG oslo_concurrency.lockutils [None req-e58c6e6b-9e02-4cea-9582-b49853beadb0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.721 187010 DEBUG nova.network.neutron [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Updated VIF entry in instance network info cache for port fc691b86-f650-4fef-ba47-39bbfc1eeff8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.722 187010 DEBUG nova.network.neutron [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Updating instance_info_cache with network_info: [{"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:57:29 np0005555140 nova_compute[187006]: 2025-12-11 09:57:29.737 187010 DEBUG oslo_concurrency.lockutils [req-c6c0c99f-15c9-42de-a1e0-f302b0f50200 req-70ba42ff-19e4-4592-87f7-535e194a2437 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-72b08ac6-c357-4b0b-8c3f-05df9e439a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:57:31 np0005555140 nova_compute[187006]: 2025-12-11 09:57:31.278 187010 DEBUG nova.compute.manager [req-44cb9c1b-45e0-4336-91f4-251a5b5d4e95 req-85cd91e6-7122-4324-a2ed-b78c413f1b9a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:57:31 np0005555140 nova_compute[187006]: 2025-12-11 09:57:31.279 187010 DEBUG oslo_concurrency.lockutils [req-44cb9c1b-45e0-4336-91f4-251a5b5d4e95 req-85cd91e6-7122-4324-a2ed-b78c413f1b9a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:31 np0005555140 nova_compute[187006]: 2025-12-11 09:57:31.279 187010 DEBUG oslo_concurrency.lockutils [req-44cb9c1b-45e0-4336-91f4-251a5b5d4e95 req-85cd91e6-7122-4324-a2ed-b78c413f1b9a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:31 np0005555140 nova_compute[187006]: 2025-12-11 09:57:31.279 187010 DEBUG oslo_concurrency.lockutils [req-44cb9c1b-45e0-4336-91f4-251a5b5d4e95 req-85cd91e6-7122-4324-a2ed-b78c413f1b9a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:31 np0005555140 nova_compute[187006]: 2025-12-11 09:57:31.279 187010 DEBUG nova.compute.manager [req-44cb9c1b-45e0-4336-91f4-251a5b5d4e95 req-85cd91e6-7122-4324-a2ed-b78c413f1b9a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] No waiting events found dispatching network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:57:31 np0005555140 nova_compute[187006]: 2025-12-11 09:57:31.280 187010 WARNING nova.compute.manager [req-44cb9c1b-45e0-4336-91f4-251a5b5d4e95 req-85cd91e6-7122-4324-a2ed-b78c413f1b9a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received unexpected event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:57:31 np0005555140 nova_compute[187006]: 2025-12-11 09:57:31.787 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:33 np0005555140 nova_compute[187006]: 2025-12-11 09:57:33.072 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:36 np0005555140 podman[216539]: 2025-12-11 09:57:36.687923497 +0000 UTC m=+0.058795301 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 04:57:36 np0005555140 podman[216540]: 2025-12-11 09:57:36.707192177 +0000 UTC m=+0.077412202 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 11 04:57:36 np0005555140 nova_compute[187006]: 2025-12-11 09:57:36.790 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:38 np0005555140 nova_compute[187006]: 2025-12-11 09:57:38.075 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:41 np0005555140 podman[216594]: 2025-12-11 09:57:41.731446184 +0000 UTC m=+0.085723849 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 11 04:57:41 np0005555140 nova_compute[187006]: 2025-12-11 09:57:41.791 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:43 np0005555140 nova_compute[187006]: 2025-12-11 09:57:43.078 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:43 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:43Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:2a:6e 10.100.0.22
Dec 11 04:57:43 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:43Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:2a:6e 10.100.0.22
Dec 11 04:57:46 np0005555140 podman[216613]: 2025-12-11 09:57:46.68877517 +0000 UTC m=+0.060790477 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:57:46 np0005555140 nova_compute[187006]: 2025-12-11 09:57:46.793 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:48 np0005555140 nova_compute[187006]: 2025-12-11 09:57:48.081 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:48.623 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:48.625 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:48.626 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:49 np0005555140 podman[216640]: 2025-12-11 09:57:49.242410535 +0000 UTC m=+0.075593840 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 11 04:57:49 np0005555140 podman[216639]: 2025-12-11 09:57:49.256272521 +0000 UTC m=+0.103022613 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.168 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'name': 'tempest-TestNetworkBasicOps-server-1623364696', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'user_id': '277eaa28c80b403abb371276e6721821', 'hostId': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.171 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'name': 'tempest-TestNetworkBasicOps-server-1422762768', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'user_id': '277eaa28c80b403abb371276e6721821', 'hostId': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.207 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.read.latency volume: 190408288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.207 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.read.latency volume: 23594368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.240 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.read.latency volume: 197197218 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.241 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.read.latency volume: 25800021 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb8c086c-956b-4531-95d4-97a339a9dc96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 190408288, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.172254', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6648d6-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': 'cb38a0e721049a218b2fd8a74d8c7b2027c3bdc011f9d4e01dcd59189eb28023'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23594368, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': 
None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.172254', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6654ac-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': '472a7eae4c109221e6b180782eeab00a7feb8192c9d63c11b73cc066a283b986'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 197197218, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.172254', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6b6a50-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '6d91ac2bdec0f6e97abaaacf07fddbd3f8c078cff1f3fbf4e2775d3628e71a77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25800021, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.172254', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6b7702-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '9e3e7b07d7f8f9db333fdbd61e8de02d97ca3848bad48e4e0d10da8e44297034'}]}, 'timestamp': '2025-12-11 09:57:50.241508', '_unique_id': 'e9c7cab9d04b4fc7bc2276afcf85c569'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.247 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 72b08ac6-c357-4b0b-8c3f-05df9e439a54 / tapfc691b86-f6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.248 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.250 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d1980092-5559-4d0d-a6cc-184b22110cc4 / tap3787f1e5-cc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.250 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d1980092-5559-4d0d-a6cc-184b22110cc4 / tap435350c0-ea inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.251 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.251 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '461b0973-f361-4826-ac58-5b89f8a5c07a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.243928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da6c87c8-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': '7038c634fd0d04aca3dad9282a27a91916a08707f624dbf2a3fc0d0533a8a41a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.243928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da6cf62c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': 'e3796edcb7eeb7bc9aad8f06232f6d03cd16e0a6536b68f053aaa42c8aebc3e8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.243928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da6d00ae-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '2a6191a5b586f5a686fdddc00bb867a25114b83a50639c977d0776e274b21a0e'}]}, 'timestamp': '2025-12-11 09:57:50.251574', '_unique_id': 'da54835091d945828fd9a63c4e3be27f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.252 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.253 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.253 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>]
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.253 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.read.requests volume: 1104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.253 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.254 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.read.requests volume: 1089 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.254 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7107e291-3d89-4ba5-ad50-e97952f8c988', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1104, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.253551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6d5842-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': 'ed22aae44a47e5e1f4f2947884affe189504ef86f4148d9a1a704b67eb047d7d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': 
None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.253551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6d6198-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': '10479964afb0336a2ee4f3b2044b1026e30d62383af8a8dd99c899aae1a7484e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1089, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.253551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6d694a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '99fb87b1ac3d4f87692aef94374738e26b9623e39c11d5f887b4843c3f09207f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.253551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6d7278-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': 'b6341f2bf78a797bedc49f1ca9bfa1fb8906f41aafbf7122f1f9d99fdc3fb572'}]}, 'timestamp': '2025-12-11 09:57:50.254468', '_unique_id': 'a3e82bec90c14d5f93550a3ba0674bca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.255 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.256 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>]
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.256 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.write.bytes volume: 72892416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.256 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.256 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.write.bytes volume: 73084928 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.256 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34bf777d-2ec0-431e-8dea-a2d35335b267', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72892416, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.256182', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6dbeae-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': 'b8be395d712d41adbe983d228d5ecd7a9b569fce8d957831e7a1a95b426f0b6a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.256182', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6dc692-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': '3d314653dfd83424aa0765a995579405b9e179ef643864529a1cdddf4df91471'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73084928, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.256182', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6dcf52-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '2334304c0e3aa197c743cb6a2e336e69eb4ad4fb18be62701220706376e8fb5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.256182', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6dd9a2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '4546505faead8c5a886341f15e065c82a2c96093e5c986885f347d4eaa776ecf'}]}, 'timestamp': '2025-12-11 09:57:50.257097', '_unique_id': '74192c707a6248519f68368aa5ffd1a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.257 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.258 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.258 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.258 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fcdbc39-c27e-4e3e-ae2e-a3a4427b9c15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.258389', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da6e1570-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': 'd398dbaef3d0d21f1c038e8e5d44f0c30b014100ec569a73fa8a6e348c5aaa55'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.258389', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da6e1f66-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '4275005238b4d20918a0ea30224227fad137c72c3f699bfd7a56813f1fda8706'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.258389', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da6e2a4c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': 'edb95ecd1bf8257ab300e1adc802d626577ba37f554578ab54fd88a12cb75294'}]}, 'timestamp': '2025-12-11 09:57:50.259174', '_unique_id': 'ca82660a83b24a13b1eb1bc7ecb75810'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.259 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.260 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.write.latency volume: 21052336576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.260 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.260 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.write.latency volume: 6325742635 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e47b548-2bd4-46f1-8eec-2f02e876d2d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052336576, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.260327', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6e61b0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': '83e6e9442b50b158e32a39eed128b0a490ba67000b56975ad1f883334bdb84ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.260327', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6e6b06-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': '12c83c6e100cc3b8db040df6943cf893a4d3a9f0377c6413bd11f76c1f743f59'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6325742635, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.260327', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6e74e8-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '1699dc110d0e1346bccf3bcf820c1930df249e98432e55f47a65a81575ac6c30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.260327', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6e7e2a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': 'aaca30c3255e498e657e581136434dbc5efdf1e147ef1d23dd15f99f1303db15'}]}, 'timestamp': '2025-12-11 09:57:50.261313', '_unique_id': 'd3deb59d0e8042478cf4252dffb89075'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.261 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.262 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.262 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.263 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcbfcb20-5b23-4c30-bce0-a380b44fe8eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.262686', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da6ebde0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': '205307a6d77d68601615c57697d8df993a2775f695aaf2bc032c1b0d304840e5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.262686', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da6ec916-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': 'b7484e0421ce0c7863a1f133db8e17b4fe6dbfd1eccd09a9fd4b0af5df2b2573'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.262686', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da6ed258-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '6b247dc784743c5a9f125520024a0077c8aae6b92fd5cd17e51b05885101f241'}]}, 'timestamp': '2025-12-11 09:57:50.263491', '_unique_id': '8e0e985af9194deeb5ed7d197c35676d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.264 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>]
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.265 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.265 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.write.requests volume: 300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.265 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.265 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.write.requests volume: 333 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.265 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e67a91b4-c204-49b5-8bd0-1b9c313176f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 300, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.265172', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6f1e84-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': '968da6d0ddb8c877f0c8eea3ceef6fa7b7391f2aa827c47e39f7b421727462ef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': 
None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.265172', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6f27bc-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': 'bc64ce70aca237c63c97f731223b239e5b2cbee495da384f2ec7a1876bcebb62'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 333, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.265172', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6f3130-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': 'b16195d6aaa641e9014b4dda8a2332c62ef8fe283ba93b426950b766ba4e76b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.265172', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6f3c0c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '8f0c9fc8ba018d0954013df9bec107220c185d7988e93bc10ebef029acbdd2cc'}]}, 'timestamp': '2025-12-11 09:57:50.266188', '_unique_id': '5327ef8d8b4346d8aa4257569bad7967'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.266 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.268 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.280 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.280 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.292 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.292 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bf60a04-d9bf-4636-9fac-9213e825f1bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.268940', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da71749a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.894152015, 'message_signature': '0ea88f992c40ae49d00337f4d7c4a73e74455c9e1181d2b889c4afb848f6f892'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.268940', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da717f12-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.894152015, 'message_signature': 'ccb4f0eb7d0296b51ff1329f8cb3d497c965be9e4206c6660f6f89dd47af70c0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.268940', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da734b1c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.906153298, 'message_signature': 'f4ef0bce77d2ca03095e33bf518298bdabd2a474f80be385680eb4e3f3fb38a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.268940', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da7354b8-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.906153298, 'message_signature': 'a749919cc6135016f4d946d1cea31e93c2fa2bc8bd1d2c8e1990aec2a19480d2'}]}, 'timestamp': '2025-12-11 09:57:50.293017', '_unique_id': 'fe3fa45cc1a647bab0bf37c27d20d93b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.293 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.294 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.294 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.incoming.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.packets volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac623b85-4f6d-4b55-b191-46451d32ad5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.294712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da73a24c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': '92ed75819ed5fff4b5a06400b32f67e2969068e823d73db3c19473790b1b9567'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 308, 
'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.294712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da73ab0c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '69c87302d910e683e92feec793466174d0a06d1dfeb4d44df6e51359e62db467'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.294712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da73b2f0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': 'd774bc1847800ee1d5aa66bb76fbabc82fbf4da1bb917e70de4245d90a276859'}]}, 'timestamp': '2025-12-11 09:57:50.295421', '_unique_id': 'e4a5b5be2b0c463b99476079d9be25d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.295 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.296 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.296 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.297 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f8b3208-6052-4351-97f5-ba19107ff201', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.296601', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da73e9a0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': '0a03ae92ed88012d4418c4f5211b8113e2902439a2cc3f1647d9054f7e479d20'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.296601', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da73f2f6-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '53411e7cd3150acdd29a8e829ea477878f71f93a0c0052ae27221173c39c77b8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.296601', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da73fe04-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '7352af938815670a1db75c28a8a256e1677090deffa96a5fdb572a7ca3b5c4d8'}]}, 'timestamp': '2025-12-11 09:57:50.297383', '_unique_id': 'efd9b7ac5007422fa9a7a93885af0c4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.298 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.299 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.299 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5990ad2-665b-422a-9b9d-41404e0eb6ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.298896', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da744396-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': 'd3cf1ef2e872ada6bef60bf6ac168d6cfbd53b439ac457dbdea7d6412b1163bd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.298896', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da744d14-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '0d274ac9b7f919aaf956bd9ae2222533b0d7d3bb2d662aa3fc43e41d810569ae'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.298896', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da7457aa-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '38ee1d14707c74e68bce5ac43521bfb42a81eb226e6bdbb47b5fa9e416ac8f78'}]}, 'timestamp': '2025-12-11 09:57:50.299675', '_unique_id': '57a0351ac8c84eb3a83231b3574373bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.300 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.301 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.301 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.packets volume: 341 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.301 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2f335f5-fbe9-4fae-ad59-a1520fe482d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.301105', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da749b34-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': 'e00bfe75bd7a595f42d9c35e753bef8e667df4dbd8b3073f587feeebbe1929e7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 341, 
'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.301105', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da74a64c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': 'b3b31e4a46b32e57832469e5d2fd2745fa334fcf9a2af3b74b8cbba31e2570c2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.301105', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da74b0e2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '0dfae4197fc6a55458b0beef5cb1a141f429e0526a75e171ba5a84d34715962c'}]}, 'timestamp': '2025-12-11 09:57:50.301980', '_unique_id': '164cc6c886bf454ca11e2341fe4c49a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.302 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.303 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.incoming.bytes volume: 2346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.303 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.bytes volume: 59058 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.304 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.incoming.bytes volume: 1596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d9dd474-d22a-48a2-8ecc-b9aad842cbf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2346, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.303584', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da74fbe2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': '7e6a4544f4f6aca6ec3424115c86da49ffbc938bf4c9729ebedf5ef2a9a186cf'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 59058, 'user_id': 
'277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.303584', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da7507ea-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '424f32fcbb3d79be914b263821af6ac5d2327432489d21e6fd8519c7e75ed036'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1596, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.303584', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da7512b2-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '564d3855eafea544a5e4529d63a6be156c269266982efd0569ff4968cefa0c3f'}]}, 'timestamp': '2025-12-11 09:57:50.304462', '_unique_id': 'ed7a468eb6414725b9a835584dc9dde9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.305 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.321 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/cpu volume: 11070000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.338 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/cpu volume: 11540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '868d781a-bced-4e6f-b2fb-80073bd31287', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11070000000, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'timestamp': '2025-12-11T09:57:50.305906', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da77cbec-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.947081057, 'message_signature': '8b34720fc4eb7aa2a33662b0b7cf5e9a63ab581a02ac6b2f441bca01d848554b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11540000000, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'd1980092-5559-4d0d-a6cc-184b22110cc4', 'timestamp': '2025-12-11T09:57:50.305906', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da7a446c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.963174666, 'message_signature': 'c07c8c5c61a8e5925e5a6530298550da14ae7cdd2ea7a3d027327a10d182c45d'}]}, 'timestamp': '2025-12-11 09:57:50.338552', '_unique_id': '279cdb859ff74b70981bb6924c1ca4e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.339 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.340 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.read.bytes volume: 30521856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.340 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.341 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.read.bytes volume: 30333440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.341 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '332fc990-a7f4-4ccb-ac18-4940d56b1bf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30521856, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.340474', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da7a9d22-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': '929b31a9dfec937aa418fbe1a761e172478472e72236793e1131da82c4d6373a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.340474', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da7aa8d0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.797545316, 'message_signature': 'b3b9d0bcaa1f9c8f9bd364ace96aeff2aa6f8f280d8ad2b3bd73e1f31eec3243'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30333440, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.340474', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da7ab398-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '0105bd1f6ea9bb30abd617ca75730faf8c82472d5524eb94033d71298c04fe3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.340474', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da7abe10-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.83303891, 'message_signature': '0567d5feb9cdde93af8e90b99141794d4efdc31c35c703693c70f8d5d5fc43b3'}]}, 'timestamp': '2025-12-11 09:57:50.341612', '_unique_id': '331bf4d28cf64134a782f2851aeacad7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.342 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.343 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.343 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.343 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.343 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '019d96b9-019d-492e-be1d-7c3b92cf282e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.343165', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da7b05be-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.894152015, 'message_signature': '9a25438d7c7e7bac9bfe681122acb611ab00c8537b7b3141663ab7771a05806d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.343165', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da7b0ff0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.894152015, 'message_signature': '27bf7d4dbafbdf83a518b0a1e0670b71b8cf5d9c0ff169153fe3dfe1cefcf84a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.343165', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da7b19fa-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.906153298, 'message_signature': 'd0f9c8fd9a0913c53fe4608a8bbacfbc7f03bda2bc7766a5270281222843bc18'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.343165', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da7b2954-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.906153298, 'message_signature': '80c8caff5f6c66dce6d4f48ec6303cfab0003d3cd0a7262b351b70ef5900dcd0'}]}, 'timestamp': '2025-12-11 09:57:50.344355', '_unique_id': '9924a60d5cf8426fb44777197ae4c4db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.344 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.345 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.345 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.346 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.346 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.346 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3eeedf8d-a9f5-4cfd-97cd-5a8b6302f77e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54-vda', 'timestamp': '2025-12-11T09:57:50.345798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da7b6d4c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.894152015, 'message_signature': '2ac58ad8c27f689089c966f6c4ed550e69349f5294d1c6b25fe3c69c0969dcf0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'72b08ac6-c357-4b0b-8c3f-05df9e439a54-sda', 'timestamp': '2025-12-11T09:57:50.345798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da7b774c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.894152015, 'message_signature': 'fe3d9f61dc63adb5a5089da34eaf57d4e12eff01e39c91e172054cb4c4c1fe76'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-vda', 'timestamp': '2025-12-11T09:57:50.345798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da7b828c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.906153298, 'message_signature': '166f2e312cef4fabf54dbca30598b6e054ff9195ac1ed1e975377c6a507eda30'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4-sda', 'timestamp': '2025-12-11T09:57:50.345798', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da7b8c6e-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.906153298, 'message_signature': '133fbc6e7d27efd3d62546783b91e5e4f6631f9a1c9ca6d5a0faa9fe0b727076'}]}, 'timestamp': '2025-12-11 09:57:50.346919', '_unique_id': 'c7b5bea18a4340388d8692be89e21684'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.347 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.348 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.outgoing.bytes volume: 1578 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.348 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.bytes volume: 52846 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.348 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.bytes volume: 2516 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aa62917-eba1-4e8f-8d01-2503d7f7b5ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1578, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.348350', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da7bd07a-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': 'ccf598f483b3989e8810274ed5d5090134de09e04ea26fd1a4dc3ab3ea449433'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 52846, 'user_id': 
'277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.348350', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da7bdbce-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': 'd15f141e33f78943d529280e1380786a7de09ba28846004e1b41851e10e2ba64'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2516, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.348350', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da7be736-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '6f81981a35d540cbe58e6cef27d9af16ad28c1affe7c8f22054b04d4e666da4e'}]}, 'timestamp': '2025-12-11 09:57:50.349224', '_unique_id': 'ac4f4638d19f40eb8187511995fe06dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.349 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.350 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.350 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/memory.usage volume: 45.90234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.350 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/memory.usage volume: 43.6953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5866761-2c40-494e-92ea-58f105ffe35c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 45.90234375, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'timestamp': '2025-12-11T09:57:50.350663', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'instance-00000007', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da7c2ac0-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.947081057, 'message_signature': '7a303fc76acce829e6d80629cd40d29c94d465bd9bcdacecc37695b86cf38e65'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.6953125, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'd1980092-5559-4d0d-a6cc-184b22110cc4', 'timestamp': '2025-12-11T09:57:50.350663', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'instance-00000006', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da7c36aa-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.963174666, 'message_signature': 'a7fcfe47337b285da6cd594b0ba61ed1b2b80b30bacec817c97fe0a10b430c41'}]}, 'timestamp': '2025-12-11 09:57:50.351252', '_unique_id': '99d0570b1bdd4b9e8ea394490b520946'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.351 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.352 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.352 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1623364696>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1422762768>]
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.352 12 DEBUG ceilometer.compute.pollsters [-] 72b08ac6-c357-4b0b-8c3f-05df9e439a54/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.353 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.353 12 DEBUG ceilometer.compute.pollsters [-] d1980092-5559-4d0d-a6cc-184b22110cc4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c018283-d6c0-4d9a-8fd0-dd9b2edb5cc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000007-72b08ac6-c357-4b0b-8c3f-05df9e439a54-tapfc691b86-f6', 'timestamp': '2025-12-11T09:57:50.352902', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1623364696', 'name': 'tapfc691b86-f6', 'instance_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:2a:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc691b86-f6'}, 'message_id': 'da7c80f6-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.869136601, 'message_signature': 'e604c3534aec8ecfb61c6847f21b553833487c77cd10a7742a0e2da42283b4fc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap3787f1e5-cc', 'timestamp': '2025-12-11T09:57:50.352902', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap3787f1e5-cc', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:96:8b:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3787f1e5-cc'}, 'message_id': 'da7c8a56-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': '3c9fd05a2eaec6b100d3f0e84995dcbdc26726f1587dcd65cc2e65ae4669c0a9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-00000006-d1980092-5559-4d0d-a6cc-184b22110cc4-tap435350c0-ea', 'timestamp': '2025-12-11T09:57:50.352902', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1422762768', 'name': 'tap435350c0-ea', 'instance_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'instance_type': 'm1.nano', 'host': 
'31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ca:5b:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap435350c0-ea'}, 'message_id': 'da7c926c-d677-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3428.87367307, 'message_signature': 'fe64f9b5f992d8370c719fe4362df5a554e2310913ea5180de07299f5a9c75c2'}]}, 'timestamp': '2025-12-11 09:57:50.353568', '_unique_id': 'bfe05ad2924e459389d617b0a7ff6ea7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:57:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:57:50.354 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:57:51 np0005555140 nova_compute[187006]: 2025-12-11 09:57:51.796 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.083 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:53.764 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 04:57:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:53.766 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 11 04:57:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:53.767 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.853 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.881 187010 DEBUG nova.compute.manager [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-changed-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.881 187010 DEBUG nova.compute.manager [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing instance network info cache due to event network-changed-435350c0-ea0e-4c79-8bc4-70f3e12c60c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.881 187010 DEBUG oslo_concurrency.lockutils [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.882 187010 DEBUG oslo_concurrency.lockutils [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.882 187010 DEBUG nova.network.neutron [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing network info cache for port 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 04:57:53 np0005555140 nova_compute[187006]: 2025-12-11 09:57:53.905 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:54 np0005555140 nova_compute[187006]: 2025-12-11 09:57:54.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 04:57:54 np0005555140 nova_compute[187006]: 2025-12-11 09:57:54.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 11 04:57:54 np0005555140 nova_compute[187006]: 2025-12-11 09:57:54.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 11 04:57:54 np0005555140 nova_compute[187006]: 2025-12-11 09:57:54.951 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 04:57:56 np0005555140 nova_compute[187006]: 2025-12-11 09:57:56.800 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:57:57 np0005555140 nova_compute[187006]: 2025-12-11 09:57:57.744 187010 DEBUG nova.network.neutron [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updated VIF entry in instance network info cache for port 435350c0-ea0e-4c79-8bc4-70f3e12c60c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 11 04:57:57 np0005555140 nova_compute[187006]: 2025-12-11 09:57:57.745 187010 DEBUG nova.network.neutron [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.086 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.419 187010 DEBUG oslo_concurrency.lockutils [req-b3a988a6-dcee-4a82-84cd-c199cdf70aff req-9ba73466-14dc-4093-8179-b080200ea210 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.420 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.421 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.421 187010 DEBUG nova.objects.instance [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.476 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.477 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.477 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.477 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.477 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.478 187010 INFO nova.compute.manager [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Terminating instance#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.479 187010 DEBUG nova.compute.manager [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:57:58 np0005555140 kernel: tapfc691b86-f6 (unregistering): left promiscuous mode
Dec 11 04:57:58 np0005555140 NetworkManager[55531]: <info>  [1765447078.4997] device (tapfc691b86-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00098|binding|INFO|Releasing lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 from this chassis (sb_readonly=0)
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00099|binding|INFO|Setting lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 down in Southbound
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.505 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00100|binding|INFO|Removing iface tapfc691b86-f6 ovn-installed in OVS
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.507 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.515 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:2a:6e 10.100.0.22'], port_security=['fa:16:3e:82:2a:6e 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c35273a-1805-4786-9147-92f6f3e7516e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=269634f1-de2c-4f8c-bf63-bd8d66b32201, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=fc691b86-f650-4fef-ba47-39bbfc1eeff8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.516 104288 INFO neutron.agent.ovn.metadata.agent [-] Port fc691b86-f650-4fef-ba47-39bbfc1eeff8 in datapath ac1787f0-46c0-4ede-92c6-bec41e2a3f91 unbound from our chassis#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.517 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac1787f0-46c0-4ede-92c6-bec41e2a3f91#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.523 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.537 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b704d3b0-92b2-4db4-bb18-ed169191b608]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 11 04:57:58 np0005555140 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 12.904s CPU time.
Dec 11 04:57:58 np0005555140 systemd-machined[153398]: Machine qemu-7-instance-00000007 terminated.
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.568 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[5cb437b8-70f2-44a0-84b5-5a4b2c8b123f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.571 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[5a2aa10f-66c0-4c43-ab2f-e92c546cba09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 podman[216695]: 2025-12-11 09:57:58.597802323 +0000 UTC m=+0.060881740 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.599 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[5a697a1e-562b-4001-bb2d-f8f8c7fa6f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.616 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[72b36b43-dc11-4acb-9dc7-3e19ce6ea8b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac1787f0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:5b:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339098, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216730, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.632 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9a7e1ac5-66ae-442e-9423-e6eb8db85836]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339108, 'tstamp': 339108}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216731, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339110, 'tstamp': 339110}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216731, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.634 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac1787f0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.635 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.640 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.641 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac1787f0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.641 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.641 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac1787f0-40, col_values=(('external_ids', {'iface-id': '2448c0c3-3800-4261-a3ca-d11be99e9c3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.642 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:58 np0005555140 kernel: tapfc691b86-f6: entered promiscuous mode
Dec 11 04:57:58 np0005555140 kernel: tapfc691b86-f6 (unregistering): left promiscuous mode
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00101|binding|INFO|Claiming lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 for this chassis.
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00102|binding|INFO|fc691b86-f650-4fef-ba47-39bbfc1eeff8: Claiming fa:16:3e:82:2a:6e 10.100.0.22
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.704 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.714 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:2a:6e 10.100.0.22'], port_security=['fa:16:3e:82:2a:6e 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c35273a-1805-4786-9147-92f6f3e7516e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=269634f1-de2c-4f8c-bf63-bd8d66b32201, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=fc691b86-f650-4fef-ba47-39bbfc1eeff8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.715 104288 INFO neutron.agent.ovn.metadata.agent [-] Port fc691b86-f650-4fef-ba47-39bbfc1eeff8 in datapath ac1787f0-46c0-4ede-92c6-bec41e2a3f91 bound to our chassis#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.716 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac1787f0-46c0-4ede-92c6-bec41e2a3f91#033[00m
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00103|binding|INFO|Setting lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 ovn-installed in OVS
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.723 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00104|binding|INFO|Setting lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 up in Southbound
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00105|binding|INFO|Releasing lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 from this chassis (sb_readonly=1)
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00106|if_status|INFO|Not setting lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 down as sb is readonly
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00107|binding|INFO|Removing iface tapfc691b86-f6 ovn-installed in OVS
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.725 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00108|binding|INFO|Releasing lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 from this chassis (sb_readonly=0)
Dec 11 04:57:58 np0005555140 ovn_controller[95438]: 2025-12-11T09:57:58Z|00109|binding|INFO|Setting lport fc691b86-f650-4fef-ba47-39bbfc1eeff8 down in Southbound
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.732 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[03a7bb35-3a65-4a3e-8398-dc28a2310539]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.735 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.737 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:2a:6e 10.100.0.22'], port_security=['fa:16:3e:82:2a:6e 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '72b08ac6-c357-4b0b-8c3f-05df9e439a54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c35273a-1805-4786-9147-92f6f3e7516e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=269634f1-de2c-4f8c-bf63-bd8d66b32201, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=fc691b86-f650-4fef-ba47-39bbfc1eeff8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.738 187010 INFO nova.virt.libvirt.driver [-] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Instance destroyed successfully.#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.739 187010 DEBUG nova.objects.instance [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 72b08ac6-c357-4b0b-8c3f-05df9e439a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.750 187010 DEBUG nova.virt.libvirt.vif [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:57:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1623364696',display_name='tempest-TestNetworkBasicOps-server-1623364696',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1623364696',id=7,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIeHb0/t1D0SUyGLgaiIPGcB+qu02xGrxqrZw0NDgzlKndSZC7ZFLEhrd1LjC+GpQofYxUoA8F20+3zQUnXezxZH+vFzjqrOwyek358sCEFvhoxJQrpgUML5K8Ufsdpu2Q==',key_name='tempest-TestNetworkBasicOps-833561704',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:57:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-0um5fobf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:57:29Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=72b08ac6-c357-4b0b-8c3f-05df9e439a54,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.750 187010 DEBUG nova.network.os_vif_util [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "address": "fa:16:3e:82:2a:6e", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc691b86-f6", "ovs_interfaceid": "fc691b86-f650-4fef-ba47-39bbfc1eeff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.751 187010 DEBUG nova.network.os_vif_util [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:2a:6e,bridge_name='br-int',has_traffic_filtering=True,id=fc691b86-f650-4fef-ba47-39bbfc1eeff8,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc691b86-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.751 187010 DEBUG os_vif [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:2a:6e,bridge_name='br-int',has_traffic_filtering=True,id=fc691b86-f650-4fef-ba47-39bbfc1eeff8,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc691b86-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.753 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.753 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc691b86-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.754 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.756 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.758 187010 INFO os_vif [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:2a:6e,bridge_name='br-int',has_traffic_filtering=True,id=fc691b86-f650-4fef-ba47-39bbfc1eeff8,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc691b86-f6')#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.759 187010 INFO nova.virt.libvirt.driver [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Deleting instance files /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54_del#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.759 187010 INFO nova.virt.libvirt.driver [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Deletion of /var/lib/nova/instances/72b08ac6-c357-4b0b-8c3f-05df9e439a54_del complete#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.760 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[dad93d97-5e57-45fb-a748-8428a9a98e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.763 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[6f24e451-3554-4c7b-8081-5f9de4755ad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.791 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[38fdd192-41d1-44ea-8c1f-521efb9f0ad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.806 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[35b195e6-369f-48b9-a81f-83d9bf63d2aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac1787f0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:5b:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339098, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216754, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.810 187010 INFO nova.compute.manager [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.810 187010 DEBUG oslo.service.loopingcall [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.810 187010 DEBUG nova.compute.manager [-] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.811 187010 DEBUG nova.network.neutron [-] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.825 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[08b37105-c9b4-40b9-b35b-72f701cb4f48]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339108, 'tstamp': 339108}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216755, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339110, 'tstamp': 339110}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216755, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.826 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac1787f0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.827 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.828 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.829 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac1787f0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.829 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.829 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac1787f0-40, col_values=(('external_ids', {'iface-id': '2448c0c3-3800-4261-a3ca-d11be99e9c3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.830 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.831 104288 INFO neutron.agent.ovn.metadata.agent [-] Port fc691b86-f650-4fef-ba47-39bbfc1eeff8 in datapath ac1787f0-46c0-4ede-92c6-bec41e2a3f91 unbound from our chassis#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.832 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac1787f0-46c0-4ede-92c6-bec41e2a3f91#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.848 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[909a8c47-64b9-4391-87e7-d1372c4928ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.884 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[51cf0357-db38-4bea-925b-31d39f8f702d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.887 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4dbab1-2d14-4f3e-a015-45cb03e227ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.917 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2ce3e3-9197-4a8a-bfa8-aa57919c0051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.932 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6af1d433-5f62-48ae-b02f-676f7adec54c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac1787f0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:5b:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339098, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216761, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.951 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[892902bf-07e8-4ee4-bfdf-b884ea4fcaf7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339108, 'tstamp': 339108}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216762, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapac1787f0-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 339110, 'tstamp': 339110}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216762, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.953 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac1787f0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.955 187010 DEBUG nova.compute.manager [req-f33dae22-cd58-4d51-bc95-a7181faec0bd req-31f6ec12-52bd-4599-989d-12f1e83f5942 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-vif-unplugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.955 187010 DEBUG oslo_concurrency.lockutils [req-f33dae22-cd58-4d51-bc95-a7181faec0bd req-31f6ec12-52bd-4599-989d-12f1e83f5942 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.955 187010 DEBUG oslo_concurrency.lockutils [req-f33dae22-cd58-4d51-bc95-a7181faec0bd req-31f6ec12-52bd-4599-989d-12f1e83f5942 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.956 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac1787f0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.956 187010 DEBUG oslo_concurrency.lockutils [req-f33dae22-cd58-4d51-bc95-a7181faec0bd req-31f6ec12-52bd-4599-989d-12f1e83f5942 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.956 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.956 187010 DEBUG nova.compute.manager [req-f33dae22-cd58-4d51-bc95-a7181faec0bd req-31f6ec12-52bd-4599-989d-12f1e83f5942 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] No waiting events found dispatching network-vif-unplugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.956 187010 DEBUG nova.compute.manager [req-f33dae22-cd58-4d51-bc95-a7181faec0bd req-31f6ec12-52bd-4599-989d-12f1e83f5942 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-vif-unplugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.956 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac1787f0-40, col_values=(('external_ids', {'iface-id': '2448c0c3-3800-4261-a3ca-d11be99e9c3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:57:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:57:58.956 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:57:58 np0005555140 nova_compute[187006]: 2025-12-11 09:57:58.959 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.387 187010 DEBUG nova.network.neutron [-] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.407 187010 INFO nova.compute.manager [-] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Took 0.60 seconds to deallocate network for instance.#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.457 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.458 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.472 187010 DEBUG nova.compute.manager [req-4b0be94d-a7da-4ff5-87d8-4f539363c70f req-ddca631e-49ae-4326-aa5d-1277070bd9d6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-vif-deleted-fc691b86-f650-4fef-ba47-39bbfc1eeff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.498 187010 DEBUG nova.scheduler.client.report [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Refreshing inventories for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.513 187010 DEBUG nova.scheduler.client.report [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating ProviderTree inventory for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.513 187010 DEBUG nova.compute.provider_tree [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.532 187010 DEBUG nova.scheduler.client.report [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Refreshing aggregate associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.554 187010 DEBUG nova.scheduler.client.report [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Refreshing trait associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,HW_CPU_X86_SVM,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.613 187010 DEBUG nova.compute.provider_tree [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.633 187010 DEBUG nova.scheduler.client.report [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.660 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.689 187010 INFO nova.scheduler.client.report [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 72b08ac6-c357-4b0b-8c3f-05df9e439a54#033[00m
Dec 11 04:57:59 np0005555140 nova_compute[187006]: 2025-12-11 09:57:59.742 187010 DEBUG oslo_concurrency.lockutils [None req-eab6aabe-35fd-4311-b432-6d29023a6f72 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.442 187010 DEBUG oslo_concurrency.lockutils [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "interface-d1980092-5559-4d0d-a6cc-184b22110cc4-435350c0-ea0e-4c79-8bc4-70f3e12c60c2" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.442 187010 DEBUG oslo_concurrency.lockutils [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-d1980092-5559-4d0d-a6cc-184b22110cc4-435350c0-ea0e-4c79-8bc4-70f3e12c60c2" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.459 187010 DEBUG nova.objects.instance [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'flavor' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.479 187010 DEBUG nova.virt.libvirt.vif [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:56:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:56:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.480 187010 DEBUG nova.network.os_vif_util [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.480 187010 DEBUG nova.network.os_vif_util [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.483 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.486 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.489 187010 DEBUG nova.virt.libvirt.driver [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Attempting to detach device tap435350c0-ea from instance d1980092-5559-4d0d-a6cc-184b22110cc4 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.490 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] detach device xml: <interface type="ethernet">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <mac address="fa:16:3e:ca:5b:f8"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <model type="virtio"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <mtu size="1442"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <target dev="tap435350c0-ea"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: </interface>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.498 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.502 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <name>instance-00000006</name>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <uuid>d1980092-5559-4d0d-a6cc-184b22110cc4</uuid>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:57:12</nova:creationTime>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:port uuid="435350c0-ea0e-4c79-8bc4-70f3e12c60c2">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='serial'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='uuid'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk' index='2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config' index='1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:96:8b:66'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target dev='tap3787f1e5-cc'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:ca:5b:f8'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target dev='tap435350c0-ea'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='net1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c133,c349</label>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c133,c349</imagelabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.502 187010 INFO nova.virt.libvirt.driver [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully detached device tap435350c0-ea from instance d1980092-5559-4d0d-a6cc-184b22110cc4 from the persistent domain config.
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.504 187010 DEBUG nova.virt.libvirt.driver [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] (1/8): Attempting to detach device tap435350c0-ea with device alias net1 from instance d1980092-5559-4d0d-a6cc-184b22110cc4 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.504 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] detach device xml: <interface type="ethernet">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <mac address="fa:16:3e:ca:5b:f8"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <model type="virtio"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <mtu size="1442"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <target dev="tap435350c0-ea"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: </interface>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 11 04:58:00 np0005555140 kernel: tap435350c0-ea (unregistering): left promiscuous mode
Dec 11 04:58:00 np0005555140 NetworkManager[55531]: <info>  [1765447080.6023] device (tap435350c0-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.610 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:58:00 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:00Z|00110|binding|INFO|Releasing lport 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 from this chassis (sb_readonly=0)
Dec 11 04:58:00 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:00Z|00111|binding|INFO|Setting lport 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 down in Southbound
Dec 11 04:58:00 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:00Z|00112|binding|INFO|Removing iface tap435350c0-ea ovn-installed in OVS
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.612 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.623 187010 DEBUG nova.virt.libvirt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Received event <DeviceRemovedEvent: 1765447080.6228242, d1980092-5559-4d0d-a6cc-184b22110cc4 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.623 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.624 187010 DEBUG nova.virt.libvirt.driver [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Start waiting for the detach event from libvirt for device tap435350c0-ea with device alias net1 for instance d1980092-5559-4d0d-a6cc-184b22110cc4 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.624 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.627 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <name>instance-00000006</name>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <uuid>d1980092-5559-4d0d-a6cc-184b22110cc4</uuid>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:57:12</nova:creationTime>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:port uuid="435350c0-ea0e-4c79-8bc4-70f3e12c60c2">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='serial'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='uuid'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk' index='2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config' index='1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:96:8b:66'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target dev='tap3787f1e5-cc'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c133,c349</label>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c133,c349</imagelabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.628 187010 INFO nova.virt.libvirt.driver [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully detached device tap435350c0-ea from instance d1980092-5559-4d0d-a6cc-184b22110cc4 from the live domain config.#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.630 187010 DEBUG nova.virt.libvirt.vif [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:56:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:56:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.630 187010 DEBUG nova.network.os_vif_util [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.631 187010 DEBUG nova.network.os_vif_util [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.631 187010 DEBUG os_vif [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.633 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.633 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap435350c0-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.634 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:00 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:00.635 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:5b:f8 10.100.0.25', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=269634f1-de2c-4f8c-bf63-bd8d66b32201, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=435350c0-ea0e-4c79-8bc4-70f3e12c60c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.636 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:00 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:00.636 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 in datapath ac1787f0-46c0-4ede-92c6-bec41e2a3f91 unbound from our chassis#033[00m
Dec 11 04:58:00 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:00.637 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac1787f0-46c0-4ede-92c6-bec41e2a3f91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:58:00 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:00.638 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[93f7c0f6-66ea-438d-aa11-bd6d35394526]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:00 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:00.638 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91 namespace which is not needed anymore#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.638 187010 INFO os_vif [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea')#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.639 187010 DEBUG nova.virt.libvirt.guest [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:58:00</nova:creationTime>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:58:00 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:00 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:58:00 np0005555140 nova_compute[187006]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.750 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.773 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.774 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.774 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.774 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.774 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.775 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.775 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.775 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.798 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.799 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.799 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.799 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.875 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.934 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:00 np0005555140 nova_compute[187006]: 2025-12-11 09:58:00.935 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:00 np0005555140 neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91[216375]: [NOTICE]   (216379) : haproxy version is 2.8.14-c23fe91
Dec 11 04:58:00 np0005555140 neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91[216375]: [NOTICE]   (216379) : path to executable is /usr/sbin/haproxy
Dec 11 04:58:00 np0005555140 neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91[216375]: [WARNING]  (216379) : Exiting Master process...
Dec 11 04:58:00 np0005555140 neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91[216375]: [ALERT]    (216379) : Current worker (216381) exited with code 143 (Terminated)
Dec 11 04:58:00 np0005555140 neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91[216375]: [WARNING]  (216379) : All workers exited. Exiting... (0)
Dec 11 04:58:00 np0005555140 systemd[1]: libpod-1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1.scope: Deactivated successfully.
Dec 11 04:58:00 np0005555140 podman[216784]: 2025-12-11 09:58:00.981168314 +0000 UTC m=+0.266365928 container died 1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.005 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.068 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.069 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.069 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.069 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.070 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] No waiting events found dispatching network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.070 187010 WARNING nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received unexpected event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.070 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.070 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.070 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.071 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "72b08ac6-c357-4b0b-8c3f-05df9e439a54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.071 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] No waiting events found dispatching network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.071 187010 WARNING nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Received unexpected event network-vif-plugged-fc691b86-f650-4fef-ba47-39bbfc1eeff8 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.071 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-unplugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.071 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.072 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.072 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.072 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] No waiting events found dispatching network-vif-unplugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.072 187010 WARNING nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received unexpected event network-vif-unplugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.072 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.073 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.073 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.073 187010 DEBUG oslo_concurrency.lockutils [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.073 187010 DEBUG nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] No waiting events found dispatching network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.074 187010 WARNING nova.compute.manager [req-4e321a26-2933-47e4-8cf0-3dcf9283ac1e req-6cc0750a-f6a8-468b-bb49-ad78bf50a112 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received unexpected event network-vif-plugged-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.185 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.186 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5489MB free_disk=73.29883193969727GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.186 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.186 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.190 187010 DEBUG oslo_concurrency.lockutils [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.191 187010 DEBUG oslo_concurrency.lockutils [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.191 187010 DEBUG nova.network.neutron [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.243 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance d1980092-5559-4d0d-a6cc-184b22110cc4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.243 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.243 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.292 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.306 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.324 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.325 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:01 np0005555140 systemd[1]: var-lib-containers-storage-overlay-29b849e4770ca2c6b6e52bd7128140826e288c627769f81f53ba7c5161a0babc-merged.mount: Deactivated successfully.
Dec 11 04:58:01 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1-userdata-shm.mount: Deactivated successfully.
Dec 11 04:58:01 np0005555140 podman[216784]: 2025-12-11 09:58:01.362142235 +0000 UTC m=+0.647339839 container cleanup 1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:58:01 np0005555140 podman[216820]: 2025-12-11 09:58:01.425789063 +0000 UTC m=+0.039510609 container remove 1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.432 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e11d09bc-5562-4238-9b40-529dc4f763b8]: (4, ('Thu Dec 11 09:58:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91 (1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1)\n1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1\nThu Dec 11 09:58:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91 (1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1)\n1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.434 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[01dc23aa-3bc8-4e5b-98dc-d44cca77c5cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.435 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac1787f0-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.437 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:01 np0005555140 kernel: tapac1787f0-40: left promiscuous mode
Dec 11 04:58:01 np0005555140 systemd[1]: libpod-conmon-1ac92d23f354e222e0e624294bdc8d8e92ebba44d08b0d43aa6ecf96de56f4c1.scope: Deactivated successfully.
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.450 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.454 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e99723d4-d4ed-4615-9d0d-156df0096584]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.466 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[7edbdd59-bca0-44c7-9873-aa9de02191cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.468 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[39d94827-38f9-4268-931c-4aee6a345d25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.484 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a158bdad-2103-4f38-a08f-0d98961a11c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 339092, 'reachable_time': 15094, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216835, 'error': None, 'target': 'ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.487 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ac1787f0-46c0-4ede-92c6-bec41e2a3f91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:58:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:01.487 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[46b25e1a-47d3-4110-80aa-594c14794b32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:01 np0005555140 systemd[1]: run-netns-ovnmeta\x2dac1787f0\x2d46c0\x2d4ede\x2d92c6\x2dbec41e2a3f91.mount: Deactivated successfully.
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.560 187010 DEBUG nova.compute.manager [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-deleted-435350c0-ea0e-4c79-8bc4-70f3e12c60c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.560 187010 INFO nova.compute.manager [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Neutron deleted interface 435350c0-ea0e-4c79-8bc4-70f3e12c60c2; detaching it from the instance and deleting it from the info cache#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.561 187010 DEBUG nova.network.neutron [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.582 187010 DEBUG nova.objects.instance [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lazy-loading 'system_metadata' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.607 187010 DEBUG nova.objects.instance [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lazy-loading 'flavor' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.635 187010 DEBUG nova.virt.libvirt.vif [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:56:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:56:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.636 187010 DEBUG nova.network.os_vif_util [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converting VIF {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.636 187010 DEBUG nova.network.os_vif_util [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.642 187010 DEBUG nova.virt.libvirt.guest [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.648 187010 DEBUG nova.virt.libvirt.guest [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <name>instance-00000006</name>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <uuid>d1980092-5559-4d0d-a6cc-184b22110cc4</uuid>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:58:00</nova:creationTime>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='serial'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='uuid'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk' index='2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config' index='1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:96:8b:66'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target dev='tap3787f1e5-cc'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c133,c349</label>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c133,c349</imagelabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.650 187010 DEBUG nova.virt.libvirt.guest [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.654 187010 DEBUG nova.virt.libvirt.guest [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:ca:5b:f8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap435350c0-ea"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <name>instance-00000006</name>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <uuid>d1980092-5559-4d0d-a6cc-184b22110cc4</uuid>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:58:00</nova:creationTime>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <memory unit='KiB'>131072</memory>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <vcpu placement='static'>1</vcpu>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <resource>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <partition>/machine</partition>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </resource>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <sysinfo type='smbios'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='manufacturer'>RDO</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='product'>OpenStack Compute</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='serial'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='uuid'>d1980092-5559-4d0d-a6cc-184b22110cc4</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <entry name='family'>Virtual Machine</entry>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <boot dev='hd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <smbios mode='sysinfo'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <vmcoreinfo state='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <cpu mode='custom' match='exact' check='full'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <model fallback='forbid'>EPYC-Rome</model>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <vendor>AMD</vendor>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='x2apic'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc-deadline'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='hypervisor'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='tsc_adjust'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='spec-ctrl'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='stibp'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='ssbd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='cmp_legacy'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='overflow-recov'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='succor'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='ibrs'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='amd-ssbd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='virt-ssbd'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='lbrv'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='tsc-scale'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='vmcb-clean'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='flushbyasid'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pause-filter'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='pfthreshold'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svme-addr-chk'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='lfence-always-serializing'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='xsaves'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='svm'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='require' name='topoext'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='npt'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <feature policy='disable' name='nrip-save'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <clock offset='utc'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <timer name='pit' tickpolicy='delay'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <timer name='rtc' tickpolicy='catchup'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <timer name='hpet' present='no'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <on_poweroff>destroy</on_poweroff>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <on_reboot>restart</on_reboot>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <on_crash>destroy</on_crash>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <disk type='file' device='disk'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk' index='2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <backingStore type='file' index='3'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <format type='raw'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <source file='/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <backingStore/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      </backingStore>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target dev='vda' bus='virtio'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='virtio-disk0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <disk type='file' device='cdrom'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <driver name='qemu' type='raw' cache='none'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/disk.config' index='1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <backingStore/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target dev='sda' bus='sata'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <readonly/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='sata0-0-0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='0' model='pcie-root'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pcie.0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='1' port='0x10'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='2' port='0x11'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='3' port='0x12'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='4' port='0x13'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='5' port='0x14'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='6' port='0x15'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='7' port='0x16'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='8' port='0x17'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.8'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='9' port='0x18'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.9'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='10' port='0x19'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.10'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='11' port='0x1a'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.11'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='12' port='0x1b'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.12'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='13' port='0x1c'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.13'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='14' port='0x1d'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.14'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='15' port='0x1e'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.15'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='16' port='0x1f'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.16'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='17' port='0x20'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.17'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='18' port='0x21'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.18'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='19' port='0x22'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.19'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='20' port='0x23'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.20'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='21' port='0x24'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.21'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='22' port='0x25'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.22'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='23' port='0x26'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.23'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='24' port='0x27'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.24'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-root-port'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target chassis='25' port='0x28'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.25'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model name='pcie-pci-bridge'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='pci.26'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='usb'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <controller type='sata' index='0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='ide'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </controller>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <interface type='ethernet'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <mac address='fa:16:3e:96:8b:66'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target dev='tap3787f1e5-cc'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model type='virtio'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <driver name='vhost' rx_queue_size='512'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <mtu size='1442'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='net0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <serial type='pty'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target type='isa-serial' port='0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:        <model name='isa-serial'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      </target>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <console type='pty' tty='/dev/pts/0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <source path='/dev/pts/0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <log file='/var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4/console.log' append='off'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <target type='serial' port='0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='serial0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </console>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <input type='tablet' bus='usb'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='input0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='usb' bus='0' port='1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <input type='mouse' bus='ps2'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='input1'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <input type='keyboard' bus='ps2'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='input2'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </input>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <listen type='address' address='::0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </graphics>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <audio id='1' type='none'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <model type='virtio' heads='1' primary='yes'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='video0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <watchdog model='itco' action='reset'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='watchdog0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </watchdog>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <memballoon model='virtio'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <stats period='10'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='balloon0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <rng model='virtio'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <backend model='random'>/dev/urandom</backend>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <alias name='rng0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <label>system_u:system_r:svirt_t:s0:c133,c349</label>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c133,c349</imagelabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <label>+107:+107</label>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <imagelabel>+107:+107</imagelabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </seclabel>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.654 187010 WARNING nova.virt.libvirt.driver [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Detaching interface fa:16:3e:ca:5b:f8 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap435350c0-ea' not found.#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.655 187010 DEBUG nova.virt.libvirt.vif [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:56:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:56:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.656 187010 DEBUG nova.network.os_vif_util [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converting VIF {"id": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "address": "fa:16:3e:ca:5b:f8", "network": {"id": "ac1787f0-46c0-4ede-92c6-bec41e2a3f91", "bridge": "br-int", "label": "tempest-network-smoke--90176683", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap435350c0-ea", "ovs_interfaceid": "435350c0-ea0e-4c79-8bc4-70f3e12c60c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.656 187010 DEBUG nova.network.os_vif_util [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.656 187010 DEBUG os_vif [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.658 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.658 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap435350c0-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.659 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.662 187010 INFO os_vif [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:5b:f8,bridge_name='br-int',has_traffic_filtering=True,id=435350c0-ea0e-4c79-8bc4-70f3e12c60c2,network=Network(ac1787f0-46c0-4ede-92c6-bec41e2a3f91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap435350c0-ea')#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.662 187010 DEBUG nova.virt.libvirt.guest [req-ba84cf1a-26e6-42f1-93df-eef16ff23710 req-8c863dfe-b2f3-42cc-9a15-ceedca17916e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:name>tempest-TestNetworkBasicOps-server-1422762768</nova:name>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:creationTime>2025-12-11 09:58:01</nova:creationTime>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:flavor name="m1.nano">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:memory>128</nova:memory>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:disk>1</nova:disk>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:swap>0</nova:swap>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:flavor>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:owner>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:owner>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  <nova:ports>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    <nova:port uuid="3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5">
Dec 11 04:58:01 np0005555140 nova_compute[187006]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:    </nova:port>
Dec 11 04:58:01 np0005555140 nova_compute[187006]:  </nova:ports>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: </nova:instance>
Dec 11 04:58:01 np0005555140 nova_compute[187006]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec 11 04:58:01 np0005555140 nova_compute[187006]: 2025-12-11 09:58:01.801 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:02 np0005555140 nova_compute[187006]: 2025-12-11 09:58:02.324 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:02 np0005555140 nova_compute[187006]: 2025-12-11 09:58:02.324 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:03 np0005555140 nova_compute[187006]: 2025-12-11 09:58:03.462 187010 INFO nova.network.neutron [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Port 435350c0-ea0e-4c79-8bc4-70f3e12c60c2 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Dec 11 04:58:03 np0005555140 nova_compute[187006]: 2025-12-11 09:58:03.463 187010 DEBUG nova.network.neutron [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:03 np0005555140 nova_compute[187006]: 2025-12-11 09:58:03.502 187010 DEBUG oslo_concurrency.lockutils [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:03 np0005555140 nova_compute[187006]: 2025-12-11 09:58:03.526 187010 DEBUG oslo_concurrency.lockutils [None req-168d62c0-ff3a-4302-b926-c973c6a03788 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "interface-d1980092-5559-4d0d-a6cc-184b22110cc4-435350c0-ea0e-4c79-8bc4-70f3e12c60c2" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:03 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:03Z|00113|binding|INFO|Releasing lport be72e3a5-1e58-485d-bd82-34fa36f9e281 from this chassis (sb_readonly=0)
Dec 11 04:58:03 np0005555140 nova_compute[187006]: 2025-12-11 09:58:03.652 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.896 187010 DEBUG nova.compute.manager [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-changed-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.896 187010 DEBUG nova.compute.manager [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing instance network info cache due to event network-changed-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.897 187010 DEBUG oslo_concurrency.lockutils [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.897 187010 DEBUG oslo_concurrency.lockutils [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.898 187010 DEBUG nova.network.neutron [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Refreshing network info cache for port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.984 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.984 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.984 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.984 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.985 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.986 187010 INFO nova.compute.manager [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Terminating instance#033[00m
Dec 11 04:58:04 np0005555140 nova_compute[187006]: 2025-12-11 09:58:04.987 187010 DEBUG nova.compute.manager [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:58:05 np0005555140 kernel: tap3787f1e5-cc (unregistering): left promiscuous mode
Dec 11 04:58:05 np0005555140 NetworkManager[55531]: <info>  [1765447085.0179] device (tap3787f1e5-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.027 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:05 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:05Z|00114|binding|INFO|Releasing lport 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 from this chassis (sb_readonly=0)
Dec 11 04:58:05 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:05Z|00115|binding|INFO|Setting lport 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 down in Southbound
Dec 11 04:58:05 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:05Z|00116|binding|INFO|Removing iface tap3787f1e5-cc ovn-installed in OVS
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.030 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:05.036 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:8b:66 10.100.0.8'], port_security=['fa:16:3e:96:8b:66 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd1980092-5559-4d0d-a6cc-184b22110cc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6076e77-6bbe-422f-b8fa-650192fcd178', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f4b45b33-eea6-4cc0-a8ef-ccc6fc496694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e75b9db-d641-4613-b412-1310e816e31f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:05.038 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 in datapath f6076e77-6bbe-422f-b8fa-650192fcd178 unbound from our chassis#033[00m
Dec 11 04:58:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:05.039 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6076e77-6bbe-422f-b8fa-650192fcd178, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:58:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:05.040 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f3431d8d-f87b-4b53-afb1-b72eb7bb1bc4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:05 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:05.040 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178 namespace which is not needed anymore#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.054 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:05 np0005555140 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 11 04:58:05 np0005555140 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 15.971s CPU time.
Dec 11 04:58:05 np0005555140 systemd-machined[153398]: Machine qemu-6-instance-00000006 terminated.
Dec 11 04:58:05 np0005555140 neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178[216101]: [NOTICE]   (216105) : haproxy version is 2.8.14-c23fe91
Dec 11 04:58:05 np0005555140 neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178[216101]: [NOTICE]   (216105) : path to executable is /usr/sbin/haproxy
Dec 11 04:58:05 np0005555140 neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178[216101]: [WARNING]  (216105) : Exiting Master process...
Dec 11 04:58:05 np0005555140 neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178[216101]: [ALERT]    (216105) : Current worker (216107) exited with code 143 (Terminated)
Dec 11 04:58:05 np0005555140 neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178[216101]: [WARNING]  (216105) : All workers exited. Exiting... (0)
Dec 11 04:58:05 np0005555140 systemd[1]: libpod-7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02.scope: Deactivated successfully.
Dec 11 04:58:05 np0005555140 podman[216858]: 2025-12-11 09:58:05.164426672 +0000 UTC m=+0.041847566 container died 7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 04:58:05 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02-userdata-shm.mount: Deactivated successfully.
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.195 187010 DEBUG nova.compute.manager [req-d73f33c5-f255-4611-a9dd-3dd043f5eccc req-d12e366c-a7d0-4746-afc9-8d621688c530 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-unplugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.196 187010 DEBUG oslo_concurrency.lockutils [req-d73f33c5-f255-4611-a9dd-3dd043f5eccc req-d12e366c-a7d0-4746-afc9-8d621688c530 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.196 187010 DEBUG oslo_concurrency.lockutils [req-d73f33c5-f255-4611-a9dd-3dd043f5eccc req-d12e366c-a7d0-4746-afc9-8d621688c530 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.196 187010 DEBUG oslo_concurrency.lockutils [req-d73f33c5-f255-4611-a9dd-3dd043f5eccc req-d12e366c-a7d0-4746-afc9-8d621688c530 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.196 187010 DEBUG nova.compute.manager [req-d73f33c5-f255-4611-a9dd-3dd043f5eccc req-d12e366c-a7d0-4746-afc9-8d621688c530 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] No waiting events found dispatching network-vif-unplugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.197 187010 DEBUG nova.compute.manager [req-d73f33c5-f255-4611-a9dd-3dd043f5eccc req-d12e366c-a7d0-4746-afc9-8d621688c530 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-unplugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 04:58:05 np0005555140 systemd[1]: var-lib-containers-storage-overlay-3b004498cd56dcdff757b472a1f71b35de974e9e5cd5304a7ab813bc40ac459f-merged.mount: Deactivated successfully.
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.246 187010 INFO nova.virt.libvirt.driver [-] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Instance destroyed successfully.#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.246 187010 DEBUG nova.objects.instance [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid d1980092-5559-4d0d-a6cc-184b22110cc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.260 187010 DEBUG nova.virt.libvirt.vif [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1422762768',display_name='tempest-TestNetworkBasicOps-server-1422762768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1422762768',id=6,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByxFD8+ASGKOr6wVACNLsJcvLTAwL8EypnFrimkzlHHKvWkvvN+UuGccoqlOmQPfgNrzd+uXeHqaFcIY+PGJRkmvL7Yi+LZFWCh701K/qcyTVeA3J3On2cYSZ9g/kDfvg==',key_name='tempest-TestNetworkBasicOps-1939808554',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:56:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-5wxp9oui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:56:41Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=d1980092-5559-4d0d-a6cc-184b22110cc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.260 187010 DEBUG nova.network.os_vif_util [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.261 187010 DEBUG nova.network.os_vif_util [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:96:8b:66,bridge_name='br-int',has_traffic_filtering=True,id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5,network=Network(f6076e77-6bbe-422f-b8fa-650192fcd178),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3787f1e5-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.262 187010 DEBUG os_vif [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:8b:66,bridge_name='br-int',has_traffic_filtering=True,id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5,network=Network(f6076e77-6bbe-422f-b8fa-650192fcd178),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3787f1e5-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.264 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.264 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3787f1e5-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.266 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.268 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.271 187010 INFO os_vif [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:8b:66,bridge_name='br-int',has_traffic_filtering=True,id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5,network=Network(f6076e77-6bbe-422f-b8fa-650192fcd178),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3787f1e5-cc')#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.271 187010 INFO nova.virt.libvirt.driver [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Deleting instance files /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4_del#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.272 187010 INFO nova.virt.libvirt.driver [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Deletion of /var/lib/nova/instances/d1980092-5559-4d0d-a6cc-184b22110cc4_del complete#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.325 187010 INFO nova.compute.manager [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.325 187010 DEBUG oslo.service.loopingcall [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.326 187010 DEBUG nova.compute.manager [-] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.326 187010 DEBUG nova.network.neutron [-] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:58:05 np0005555140 podman[216858]: 2025-12-11 09:58:05.871840356 +0000 UTC m=+0.749261250 container cleanup 7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.878 187010 DEBUG nova.network.neutron [-] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:05 np0005555140 systemd[1]: libpod-conmon-7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02.scope: Deactivated successfully.
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.900 187010 INFO nova.compute.manager [-] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Took 0.57 seconds to deallocate network for instance.#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.950 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:05 np0005555140 nova_compute[187006]: 2025-12-11 09:58:05.950 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.002 187010 DEBUG nova.compute.provider_tree [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.021 187010 DEBUG nova.scheduler.client.report [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.044 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.069 187010 INFO nova.scheduler.client.report [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance d1980092-5559-4d0d-a6cc-184b22110cc4#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.106 187010 DEBUG nova.network.neutron [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updated VIF entry in instance network info cache for port 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.107 187010 DEBUG nova.network.neutron [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Updating instance_info_cache with network_info: [{"id": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "address": "fa:16:3e:96:8b:66", "network": {"id": "f6076e77-6bbe-422f-b8fa-650192fcd178", "bridge": "br-int", "label": "tempest-network-smoke--587589952", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3787f1e5-cc", "ovs_interfaceid": "3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.134 187010 DEBUG oslo_concurrency.lockutils [None req-4fc17f68-4407-4b33-ab56-648148bd9bbf 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.136 187010 DEBUG oslo_concurrency.lockutils [req-153bc19a-cb95-4c75-bbe1-49d010606854 req-35ab1f4f-ec9b-446a-899c-c5b3796d865a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-d1980092-5559-4d0d-a6cc-184b22110cc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:06 np0005555140 podman[216906]: 2025-12-11 09:58:06.417576633 +0000 UTC m=+0.524305556 container remove 7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.423 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4af65248-2774-47ec-ae2a-c205db40041a]: (4, ('Thu Dec 11 09:58:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178 (7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02)\n7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02\nThu Dec 11 09:58:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178 (7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02)\n7eb68b2a5e8dfc1931b95d5f7a36c05304c5ef4cad2d3856e2f1fda3acd38d02\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.425 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[756fabd1-342f-4920-8a8f-6a4effa6da14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.426 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6076e77-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.428 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:06 np0005555140 kernel: tapf6076e77-60: left promiscuous mode
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.442 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.446 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3a3a4a-2423-4cda-8035-8ba02a0e7121]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.462 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[53a0b1af-177d-4e65-96c3-c2deca46baed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.464 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[eca25a30-6e8f-413d-a62f-b366674d90cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.480 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b618e241-a394-4d53-82a1-16c216f2b536]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 335880, 'reachable_time': 31877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216922, 'error': None, 'target': 'ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.482 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6076e77-6bbe-422f-b8fa-650192fcd178 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:58:06 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:06.482 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2274d0-3e30-404b-a813-8fc76bfb3ffe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:06 np0005555140 systemd[1]: run-netns-ovnmeta\x2df6076e77\x2d6bbe\x2d422f\x2db8fa\x2d650192fcd178.mount: Deactivated successfully.
Dec 11 04:58:06 np0005555140 nova_compute[187006]: 2025-12-11 09:58:06.803 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.281 187010 DEBUG nova.compute.manager [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.281 187010 DEBUG oslo_concurrency.lockutils [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.281 187010 DEBUG oslo_concurrency.lockutils [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.282 187010 DEBUG oslo_concurrency.lockutils [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "d1980092-5559-4d0d-a6cc-184b22110cc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.282 187010 DEBUG nova.compute.manager [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] No waiting events found dispatching network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.282 187010 WARNING nova.compute.manager [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received unexpected event network-vif-plugged-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.282 187010 DEBUG nova.compute.manager [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Received event network-vif-deleted-3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.283 187010 INFO nova.compute.manager [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Neutron deleted interface 3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5; detaching it from the instance and deleting it from the info cache#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.283 187010 DEBUG nova.network.neutron [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Dec 11 04:58:07 np0005555140 nova_compute[187006]: 2025-12-11 09:58:07.285 187010 DEBUG nova.compute.manager [req-c6bd1c6d-3527-4ca7-b6f2-d28dff913dbf req-4448e52d-7d7c-41ac-b4b6-6531bfe3532f b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Detach interface failed, port_id=3787f1e5-cc0c-4cf0-ab99-3fe898ce54f5, reason: Instance d1980092-5559-4d0d-a6cc-184b22110cc4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec 11 04:58:07 np0005555140 podman[216924]: 2025-12-11 09:58:07.702718468 +0000 UTC m=+0.065058349 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:58:07 np0005555140 podman[216923]: 2025-12-11 09:58:07.708844364 +0000 UTC m=+0.071036310 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 11 04:58:10 np0005555140 nova_compute[187006]: 2025-12-11 09:58:10.064 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:10 np0005555140 nova_compute[187006]: 2025-12-11 09:58:10.142 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:10 np0005555140 nova_compute[187006]: 2025-12-11 09:58:10.266 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:11 np0005555140 nova_compute[187006]: 2025-12-11 09:58:11.804 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:12 np0005555140 podman[216964]: 2025-12-11 09:58:12.720734058 +0000 UTC m=+0.092509563 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:58:13 np0005555140 nova_compute[187006]: 2025-12-11 09:58:13.736 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447078.734767, 72b08ac6-c357-4b0b-8c3f-05df9e439a54 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:13 np0005555140 nova_compute[187006]: 2025-12-11 09:58:13.736 187010 INFO nova.compute.manager [-] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:58:13 np0005555140 nova_compute[187006]: 2025-12-11 09:58:13.764 187010 DEBUG nova.compute.manager [None req-3b797d93-676b-423d-af9c-b4efcbd0fd8a - - - - - -] [instance: 72b08ac6-c357-4b0b-8c3f-05df9e439a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:15 np0005555140 nova_compute[187006]: 2025-12-11 09:58:15.269 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:16 np0005555140 nova_compute[187006]: 2025-12-11 09:58:16.806 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:17 np0005555140 podman[216983]: 2025-12-11 09:58:17.691181608 +0000 UTC m=+0.060263582 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 04:58:19 np0005555140 podman[217008]: 2025-12-11 09:58:19.689312058 +0000 UTC m=+0.062635670 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, version=9.6)
Dec 11 04:58:19 np0005555140 podman[217007]: 2025-12-11 09:58:19.713597042 +0000 UTC m=+0.086441150 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 11 04:58:20 np0005555140 nova_compute[187006]: 2025-12-11 09:58:20.245 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447085.2442317, d1980092-5559-4d0d-a6cc-184b22110cc4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:20 np0005555140 nova_compute[187006]: 2025-12-11 09:58:20.245 187010 INFO nova.compute.manager [-] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:58:20 np0005555140 nova_compute[187006]: 2025-12-11 09:58:20.266 187010 DEBUG nova.compute.manager [None req-7db178b2-7bb7-4046-bd7d-c0327d0c5567 - - - - - -] [instance: d1980092-5559-4d0d-a6cc-184b22110cc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:20 np0005555140 nova_compute[187006]: 2025-12-11 09:58:20.271 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:21 np0005555140 nova_compute[187006]: 2025-12-11 09:58:21.808 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:25 np0005555140 nova_compute[187006]: 2025-12-11 09:58:25.274 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:26 np0005555140 nova_compute[187006]: 2025-12-11 09:58:26.810 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.003 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.004 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.021 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.124 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.125 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.134 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.134 187010 INFO nova.compute.claims [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.264 187010 DEBUG nova.compute.provider_tree [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.283 187010 DEBUG nova.scheduler.client.report [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.305 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.306 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.358 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.358 187010 DEBUG nova.network.neutron [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.381 187010 INFO nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.402 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.509 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.510 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.511 187010 INFO nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Creating image(s)#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.511 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.512 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.512 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.529 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.603 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.605 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.607 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.635 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.693 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.694 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:29 np0005555140 podman[217057]: 2025-12-11 09:58:29.702365961 +0000 UTC m=+0.080299164 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.728 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.729 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.729 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.784 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.785 187010 DEBUG nova.virt.disk.api [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.785 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.840 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.841 187010 DEBUG nova.virt.disk.api [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.842 187010 DEBUG nova.objects.instance [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 7a4b7b8b-7dc3-454d-a5b3-70a319a09751 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.856 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.857 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Ensure instance console log exists: /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.857 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.857 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.858 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:29 np0005555140 nova_compute[187006]: 2025-12-11 09:58:29.892 187010 DEBUG nova.policy [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:58:30 np0005555140 nova_compute[187006]: 2025-12-11 09:58:30.277 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:30 np0005555140 nova_compute[187006]: 2025-12-11 09:58:30.980 187010 DEBUG nova.network.neutron [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Successfully updated port: 9e968a26-a419-4ea6-9bbb-3dbbae785009 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:58:30 np0005555140 nova_compute[187006]: 2025-12-11 09:58:30.995 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:58:30 np0005555140 nova_compute[187006]: 2025-12-11 09:58:30.995 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:58:30 np0005555140 nova_compute[187006]: 2025-12-11 09:58:30.995 187010 DEBUG nova.network.neutron [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.087 187010 DEBUG nova.compute.manager [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received event network-changed-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.088 187010 DEBUG nova.compute.manager [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Refreshing instance network info cache due to event network-changed-9e968a26-a419-4ea6-9bbb-3dbbae785009. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.088 187010 DEBUG oslo_concurrency.lockutils [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.136 187010 DEBUG nova.network.neutron [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.812 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.866 187010 DEBUG nova.network.neutron [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Updating instance_info_cache with network_info: [{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.892 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.893 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Instance network_info: |[{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.895 187010 DEBUG oslo_concurrency.lockutils [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.895 187010 DEBUG nova.network.neutron [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Refreshing network info cache for port 9e968a26-a419-4ea6-9bbb-3dbbae785009 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.899 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Start _get_guest_xml network_info=[{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.905 187010 WARNING nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.910 187010 DEBUG nova.virt.libvirt.host [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.910 187010 DEBUG nova.virt.libvirt.host [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.917 187010 DEBUG nova.virt.libvirt.host [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.918 187010 DEBUG nova.virt.libvirt.host [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.918 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.919 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.919 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.919 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.920 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.920 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.920 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.920 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.921 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.921 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.921 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.921 187010 DEBUG nova.virt.hardware [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.925 187010 DEBUG nova.virt.libvirt.vif [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:58:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-903209475',display_name='tempest-TestNetworkBasicOps-server-903209475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-903209475',id=8,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDIB+sTAjOxDirmsYO12KwMWnYxz2zxnfGzNa7ZMfh7soCBbAbGVmdvajzfwY1shGPC4d9o9lbb4shCNor0v65EY185fBVmGjN7vRuQEh2+7lcFZO7btIaaCPI5shI3VHw==',key_name='tempest-TestNetworkBasicOps-116888488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-h6300wgu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:58:29Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=7a4b7b8b-7dc3-454d-a5b3-70a319a09751,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.926 187010 DEBUG nova.network.os_vif_util [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.927 187010 DEBUG nova.network.os_vif_util [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.927 187010 DEBUG nova.objects.instance [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a4b7b8b-7dc3-454d-a5b3-70a319a09751 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.942 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <uuid>7a4b7b8b-7dc3-454d-a5b3-70a319a09751</uuid>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <name>instance-00000008</name>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-903209475</nova:name>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:58:31</nova:creationTime>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        <nova:port uuid="9e968a26-a419-4ea6-9bbb-3dbbae785009">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <entry name="serial">7a4b7b8b-7dc3-454d-a5b3-70a319a09751</entry>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <entry name="uuid">7a4b7b8b-7dc3-454d-a5b3-70a319a09751</entry>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk.config"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:20:56:19"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <target dev="tap9e968a26-a4"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/console.log" append="off"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:58:31 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:58:31 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:58:31 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:58:31 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.944 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Preparing to wait for external event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.945 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.945 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.945 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.946 187010 DEBUG nova.virt.libvirt.vif [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:58:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-903209475',display_name='tempest-TestNetworkBasicOps-server-903209475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-903209475',id=8,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDIB+sTAjOxDirmsYO12KwMWnYxz2zxnfGzNa7ZMfh7soCBbAbGVmdvajzfwY1shGPC4d9o9lbb4shCNor0v65EY185fBVmGjN7vRuQEh2+7lcFZO7btIaaCPI5shI3VHw==',key_name='tempest-TestNetworkBasicOps-116888488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-h6300wgu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:58:29Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=7a4b7b8b-7dc3-454d-a5b3-70a319a09751,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.947 187010 DEBUG nova.network.os_vif_util [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.947 187010 DEBUG nova.network.os_vif_util [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.948 187010 DEBUG os_vif [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.948 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.949 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.949 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.953 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.954 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e968a26-a4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.955 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9e968a26-a4, col_values=(('external_ids', {'iface-id': '9e968a26-a419-4ea6-9bbb-3dbbae785009', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:56:19', 'vm-uuid': '7a4b7b8b-7dc3-454d-a5b3-70a319a09751'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.956 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:31 np0005555140 NetworkManager[55531]: <info>  [1765447111.9578] manager: (tap9e968a26-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.959 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.963 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:31 np0005555140 nova_compute[187006]: 2025-12-11 09:58:31.964 187010 INFO os_vif [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4')#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.027 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.029 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.029 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:20:56:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.029 187010 INFO nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Using config drive#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.390 187010 INFO nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Creating config drive at /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk.config#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.395 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ryhfa5c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.520 187010 DEBUG oslo_concurrency.processutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ryhfa5c" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:32 np0005555140 kernel: tap9e968a26-a4: entered promiscuous mode
Dec 11 04:58:32 np0005555140 NetworkManager[55531]: <info>  [1765447112.5823] manager: (tap9e968a26-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Dec 11 04:58:32 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:32Z|00117|binding|INFO|Claiming lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 for this chassis.
Dec 11 04:58:32 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:32Z|00118|binding|INFO|9e968a26-a419-4ea6-9bbb-3dbbae785009: Claiming fa:16:3e:20:56:19 10.100.0.4
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.582 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.585 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.590 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.594 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.603 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:56:19 10.100.0.4'], port_security=['fa:16:3e:20:56:19 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7a4b7b8b-7dc3-454d-a5b3-70a319a09751', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce84cee6-877f-4408-829a-c39275b1710b, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9e968a26-a419-4ea6-9bbb-3dbbae785009) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.604 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9e968a26-a419-4ea6-9bbb-3dbbae785009 in datapath 2cb7ffc0-466a-4561-906a-8a8e91fabf99 bound to our chassis#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.605 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb7ffc0-466a-4561-906a-8a8e91fabf99#033[00m
Dec 11 04:58:32 np0005555140 systemd-udevd[217113]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.615 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4080469d-7d0a-4720-97ee-9bdeaebea75d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.616 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2cb7ffc0-41 in ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.618 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2cb7ffc0-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.618 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d321ec-5f10-4848-8979-e5edfbc51693]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.619 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ceabde1d-254c-4599-ab7d-7ffcc59df416]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 systemd-machined[153398]: New machine qemu-8-instance-00000008.
Dec 11 04:58:32 np0005555140 NetworkManager[55531]: <info>  [1765447112.6321] device (tap9e968a26-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:58:32 np0005555140 NetworkManager[55531]: <info>  [1765447112.6337] device (tap9e968a26-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.634 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[6677525f-839c-4ab4-809a-7fc0bab78975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:32Z|00119|binding|INFO|Setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 ovn-installed in OVS
Dec 11 04:58:32 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:32Z|00120|binding|INFO|Setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 up in Southbound
Dec 11 04:58:32 np0005555140 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.648 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9eec71e9-6f59-43ba-a7d2-bde239d9a1c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.649 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.677 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[cba83847-4f01-4ad0-a1b7-23e0af788ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.683 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[d30d15e1-82a4-4b75-947f-ad62da234c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 NetworkManager[55531]: <info>  [1765447112.6850] manager: (tap2cb7ffc0-40): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Dec 11 04:58:32 np0005555140 systemd-udevd[217117]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.710 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a3a4a0-32a4-460b-abf3-f654089173e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.712 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc87c67-4a68-4246-9255-47da1f95e774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 NetworkManager[55531]: <info>  [1765447112.7323] device (tap2cb7ffc0-40): carrier: link connected
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.740 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1dad23-52a8-4cb0-a167-6b944237395d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.757 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e2db94da-1cf4-4ed4-bbde-12d95e146cc8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb7ffc0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:50:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347130, 'reachable_time': 25632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217146, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.772 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4a16ac70-a4ff-4951-9c87-e728ab9b09e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec1:50b1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347130, 'tstamp': 347130}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217147, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.792 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f7cadc5c-b017-462a-a14d-340facb04938]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb7ffc0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:50:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347130, 'reachable_time': 25632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217148, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.821 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc91c7e-93d5-405c-9d21-d1d7ab42229f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.872 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[c421417e-fabf-404a-b1c2-ab9bad6dda05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.873 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb7ffc0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.874 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.874 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb7ffc0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:32 np0005555140 kernel: tap2cb7ffc0-40: entered promiscuous mode
Dec 11 04:58:32 np0005555140 NetworkManager[55531]: <info>  [1765447112.8767] manager: (tap2cb7ffc0-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.876 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.879 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb7ffc0-40, col_values=(('external_ids', {'iface-id': '4ff438e0-383e-4ff9-ada6-ed08f969d9d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.880 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:32Z|00121|binding|INFO|Releasing lport 4ff438e0-383e-4ff9-ada6-ed08f969d9d1 from this chassis (sb_readonly=0)
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.881 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb7ffc0-466a-4561-906a-8a8e91fabf99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb7ffc0-466a-4561-906a-8a8e91fabf99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.881 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[cce2c747-5639-4336-8d06-ed380ec30970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.882 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-2cb7ffc0-466a-4561-906a-8a8e91fabf99
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/2cb7ffc0-466a-4561-906a-8a8e91fabf99.pid.haproxy
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 2cb7ffc0-466a-4561-906a-8a8e91fabf99
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:58:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:32.882 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'env', 'PROCESS_TAG=haproxy-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2cb7ffc0-466a-4561-906a-8a8e91fabf99.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.890 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.943 187010 DEBUG nova.compute.manager [req-b5c4d8c1-2b13-4a1e-b975-cf0a25add1ec req-60d3dcfe-9dd1-48f0-909a-bad03038055d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.943 187010 DEBUG oslo_concurrency.lockutils [req-b5c4d8c1-2b13-4a1e-b975-cf0a25add1ec req-60d3dcfe-9dd1-48f0-909a-bad03038055d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.944 187010 DEBUG oslo_concurrency.lockutils [req-b5c4d8c1-2b13-4a1e-b975-cf0a25add1ec req-60d3dcfe-9dd1-48f0-909a-bad03038055d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.944 187010 DEBUG oslo_concurrency.lockutils [req-b5c4d8c1-2b13-4a1e-b975-cf0a25add1ec req-60d3dcfe-9dd1-48f0-909a-bad03038055d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:32 np0005555140 nova_compute[187006]: 2025-12-11 09:58:32.944 187010 DEBUG nova.compute.manager [req-b5c4d8c1-2b13-4a1e-b975-cf0a25add1ec req-60d3dcfe-9dd1-48f0-909a-bad03038055d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Processing event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.136 187010 DEBUG nova.network.neutron [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Updated VIF entry in instance network info cache for port 9e968a26-a419-4ea6-9bbb-3dbbae785009. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.137 187010 DEBUG nova.network.neutron [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Updating instance_info_cache with network_info: [{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.164 187010 DEBUG oslo_concurrency.lockutils [req-2d534b07-27a1-4b02-8499-bcba86f24cc7 req-3fba07a9-a373-412e-ba70-e4203561b8d3 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.222 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447113.2217364, 7a4b7b8b-7dc3-454d-a5b3-70a319a09751 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.222 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] VM Started (Lifecycle Event)#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.225 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.229 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.235 187010 INFO nova.virt.libvirt.driver [-] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Instance spawned successfully.#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.235 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:58:33 np0005555140 podman[217186]: 2025-12-11 09:58:33.242492441 +0000 UTC m=+0.050347229 container create 87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.248 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.251 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.258 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.258 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.258 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.259 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.259 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.259 187010 DEBUG nova.virt.libvirt.driver [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.271 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.271 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447113.221923, 7a4b7b8b-7dc3-454d-a5b3-70a319a09751 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.271 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:58:33 np0005555140 systemd[1]: Started libpod-conmon-87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347.scope.
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.292 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.295 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447113.2286367, 7a4b7b8b-7dc3-454d-a5b3-70a319a09751 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.295 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:58:33 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:58:33 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8a5d731257df3dccbe40e91621ddf7ca7ef1089fa0a7d53a0444daee1cadb0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:58:33 np0005555140 podman[217186]: 2025-12-11 09:58:33.214375348 +0000 UTC m=+0.022230156 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:58:33 np0005555140 podman[217186]: 2025-12-11 09:58:33.315593889 +0000 UTC m=+0.123448697 container init 87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:58:33 np0005555140 podman[217186]: 2025-12-11 09:58:33.320954572 +0000 UTC m=+0.128809360 container start 87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.322 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.325 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.330 187010 INFO nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Took 3.82 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.331 187010 DEBUG nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:33 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [NOTICE]   (217206) : New worker (217208) forked
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.344 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:58:33 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [NOTICE]   (217206) : Loading success.
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.394 187010 INFO nova.compute.manager [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Took 4.31 seconds to build instance.#033[00m
Dec 11 04:58:33 np0005555140 nova_compute[187006]: 2025-12-11 09:58:33.409 187010 DEBUG oslo_concurrency.lockutils [None req-69ed5820-d6d0-4438-92e5-76d7d41b9a20 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:35 np0005555140 nova_compute[187006]: 2025-12-11 09:58:35.019 187010 DEBUG nova.compute.manager [req-ea05e564-0a94-4161-9467-84968631b5b6 req-40161ee8-db58-4bb9-bf49-0040a85da10c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:35 np0005555140 nova_compute[187006]: 2025-12-11 09:58:35.021 187010 DEBUG oslo_concurrency.lockutils [req-ea05e564-0a94-4161-9467-84968631b5b6 req-40161ee8-db58-4bb9-bf49-0040a85da10c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:35 np0005555140 nova_compute[187006]: 2025-12-11 09:58:35.021 187010 DEBUG oslo_concurrency.lockutils [req-ea05e564-0a94-4161-9467-84968631b5b6 req-40161ee8-db58-4bb9-bf49-0040a85da10c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:35 np0005555140 nova_compute[187006]: 2025-12-11 09:58:35.021 187010 DEBUG oslo_concurrency.lockutils [req-ea05e564-0a94-4161-9467-84968631b5b6 req-40161ee8-db58-4bb9-bf49-0040a85da10c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:35 np0005555140 nova_compute[187006]: 2025-12-11 09:58:35.021 187010 DEBUG nova.compute.manager [req-ea05e564-0a94-4161-9467-84968631b5b6 req-40161ee8-db58-4bb9-bf49-0040a85da10c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] No waiting events found dispatching network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:35 np0005555140 nova_compute[187006]: 2025-12-11 09:58:35.022 187010 WARNING nova.compute.manager [req-ea05e564-0a94-4161-9467-84968631b5b6 req-40161ee8-db58-4bb9-bf49-0040a85da10c b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received unexpected event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 for instance with vm_state active and task_state None.#033[00m
Dec 11 04:58:36 np0005555140 nova_compute[187006]: 2025-12-11 09:58:36.814 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:36 np0005555140 nova_compute[187006]: 2025-12-11 09:58:36.957 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:37 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:37Z|00122|binding|INFO|Releasing lport 4ff438e0-383e-4ff9-ada6-ed08f969d9d1 from this chassis (sb_readonly=0)
Dec 11 04:58:37 np0005555140 NetworkManager[55531]: <info>  [1765447117.9496] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Dec 11 04:58:37 np0005555140 NetworkManager[55531]: <info>  [1765447117.9502] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Dec 11 04:58:37 np0005555140 nova_compute[187006]: 2025-12-11 09:58:37.951 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:37 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:37Z|00123|binding|INFO|Releasing lport 4ff438e0-383e-4ff9-ada6-ed08f969d9d1 from this chassis (sb_readonly=0)
Dec 11 04:58:37 np0005555140 nova_compute[187006]: 2025-12-11 09:58:37.981 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.555 187010 DEBUG nova.compute.manager [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received event network-changed-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.557 187010 DEBUG nova.compute.manager [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Refreshing instance network info cache due to event network-changed-9e968a26-a419-4ea6-9bbb-3dbbae785009. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.557 187010 DEBUG oslo_concurrency.lockutils [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.558 187010 DEBUG oslo_concurrency.lockutils [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.558 187010 DEBUG nova.network.neutron [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Refreshing network info cache for port 9e968a26-a419-4ea6-9bbb-3dbbae785009 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:58:38 np0005555140 podman[217219]: 2025-12-11 09:58:38.711014727 +0000 UTC m=+0.069997991 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 11 04:58:38 np0005555140 podman[217218]: 2025-12-11 09:58:38.714877917 +0000 UTC m=+0.069892697 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.838 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.839 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.839 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.840 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.840 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.841 187010 INFO nova.compute.manager [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Terminating instance#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.842 187010 DEBUG nova.compute.manager [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:58:38 np0005555140 kernel: tap9e968a26-a4 (unregistering): left promiscuous mode
Dec 11 04:58:38 np0005555140 NetworkManager[55531]: <info>  [1765447118.8691] device (tap9e968a26-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:58:38 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:38Z|00124|binding|INFO|Releasing lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 from this chassis (sb_readonly=0)
Dec 11 04:58:38 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:38Z|00125|binding|INFO|Setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 down in Southbound
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.878 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:38 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:38Z|00126|binding|INFO|Removing iface tap9e968a26-a4 ovn-installed in OVS
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.881 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:38 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:38.888 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:56:19 10.100.0.4'], port_security=['fa:16:3e:20:56:19 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7a4b7b8b-7dc3-454d-a5b3-70a319a09751', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce84cee6-877f-4408-829a-c39275b1710b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9e968a26-a419-4ea6-9bbb-3dbbae785009) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:38 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:38.890 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9e968a26-a419-4ea6-9bbb-3dbbae785009 in datapath 2cb7ffc0-466a-4561-906a-8a8e91fabf99 unbound from our chassis#033[00m
Dec 11 04:58:38 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:38.891 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2cb7ffc0-466a-4561-906a-8a8e91fabf99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:58:38 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:38.893 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[634b5fe9-b06c-4732-b669-7449da2f9212]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:38 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:38.893 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 namespace which is not needed anymore#033[00m
Dec 11 04:58:38 np0005555140 nova_compute[187006]: 2025-12-11 09:58:38.895 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:38 np0005555140 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 11 04:58:38 np0005555140 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 6.311s CPU time.
Dec 11 04:58:38 np0005555140 systemd-machined[153398]: Machine qemu-8-instance-00000008 terminated.
Dec 11 04:58:39 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [NOTICE]   (217206) : haproxy version is 2.8.14-c23fe91
Dec 11 04:58:39 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [NOTICE]   (217206) : path to executable is /usr/sbin/haproxy
Dec 11 04:58:39 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [WARNING]  (217206) : Exiting Master process...
Dec 11 04:58:39 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [WARNING]  (217206) : Exiting Master process...
Dec 11 04:58:39 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [ALERT]    (217206) : Current worker (217208) exited with code 143 (Terminated)
Dec 11 04:58:39 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217202]: [WARNING]  (217206) : All workers exited. Exiting... (0)
Dec 11 04:58:39 np0005555140 systemd[1]: libpod-87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347.scope: Deactivated successfully.
Dec 11 04:58:39 np0005555140 podman[217280]: 2025-12-11 09:58:39.025641483 +0000 UTC m=+0.047513658 container died 87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 04:58:39 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347-userdata-shm.mount: Deactivated successfully.
Dec 11 04:58:39 np0005555140 systemd[1]: var-lib-containers-storage-overlay-a8a5d731257df3dccbe40e91621ddf7ca7ef1089fa0a7d53a0444daee1cadb0d-merged.mount: Deactivated successfully.
Dec 11 04:58:39 np0005555140 podman[217280]: 2025-12-11 09:58:39.0609148 +0000 UTC m=+0.082786975 container cleanup 87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 04:58:39 np0005555140 kernel: tap9e968a26-a4: entered promiscuous mode
Dec 11 04:58:39 np0005555140 NetworkManager[55531]: <info>  [1765447119.0696] manager: (tap9e968a26-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Dec 11 04:58:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:39Z|00127|binding|INFO|Claiming lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 for this chassis.
Dec 11 04:58:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:39Z|00128|binding|INFO|9e968a26-a419-4ea6-9bbb-3dbbae785009: Claiming fa:16:3e:20:56:19 10.100.0.4
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.069 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:39 np0005555140 kernel: tap9e968a26-a4 (unregistering): left promiscuous mode
Dec 11 04:58:39 np0005555140 systemd[1]: libpod-conmon-87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347.scope: Deactivated successfully.
Dec 11 04:58:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:39Z|00129|binding|INFO|Setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 ovn-installed in OVS
Dec 11 04:58:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:39Z|00130|if_status|INFO|Dropped 2 log messages in last 40 seconds (most recently, 40 seconds ago) due to excessive rate
Dec 11 04:58:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:39Z|00131|if_status|INFO|Not setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 down as sb is readonly
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.091 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:39Z|00132|binding|INFO|Releasing lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 from this chassis (sb_readonly=0)
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.103 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:56:19 10.100.0.4'], port_security=['fa:16:3e:20:56:19 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7a4b7b8b-7dc3-454d-a5b3-70a319a09751', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce84cee6-877f-4408-829a-c39275b1710b, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9e968a26-a419-4ea6-9bbb-3dbbae785009) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.112 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.113 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:56:19 10.100.0.4'], port_security=['fa:16:3e:20:56:19 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7a4b7b8b-7dc3-454d-a5b3-70a319a09751', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce84cee6-877f-4408-829a-c39275b1710b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9e968a26-a419-4ea6-9bbb-3dbbae785009) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.120 187010 INFO nova.virt.libvirt.driver [-] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Instance destroyed successfully.#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.121 187010 DEBUG nova.objects.instance [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 7a4b7b8b-7dc3-454d-a5b3-70a319a09751 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:39 np0005555140 podman[217311]: 2025-12-11 09:58:39.137611901 +0000 UTC m=+0.050594726 container remove 87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.142 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[217a3341-ad3b-4753-9036-7b56e3c214e7]: (4, ('Thu Dec 11 09:58:38 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 (87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347)\n87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347\nThu Dec 11 09:58:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 (87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347)\n87e0be1ce83fda29f7ac2fb35e517475398f4185cc0e2b3d6fd2ca4eae18c347\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.144 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[8626ea36-7ad8-4d8d-8f08-8ca892a490b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.145 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb7ffc0-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.145 187010 DEBUG nova.virt.libvirt.vif [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:58:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-903209475',display_name='tempest-TestNetworkBasicOps-server-903209475',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-903209475',id=8,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDIB+sTAjOxDirmsYO12KwMWnYxz2zxnfGzNa7ZMfh7soCBbAbGVmdvajzfwY1shGPC4d9o9lbb4shCNor0v65EY185fBVmGjN7vRuQEh2+7lcFZO7btIaaCPI5shI3VHw==',key_name='tempest-TestNetworkBasicOps-116888488',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:58:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-h6300wgu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:58:33Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=7a4b7b8b-7dc3-454d-a5b3-70a319a09751,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.147 187010 DEBUG nova.network.os_vif_util [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:39 np0005555140 kernel: tap2cb7ffc0-40: left promiscuous mode
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.148 187010 DEBUG nova.network.os_vif_util [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.149 187010 DEBUG os_vif [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.152 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.153 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e968a26-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.155 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.162 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.165 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[0db6b791-cdc2-4b26-9298-8365f6849969]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.167 187010 INFO os_vif [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4')#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.168 187010 INFO nova.virt.libvirt.driver [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Deleting instance files /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751_del#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.169 187010 INFO nova.virt.libvirt.driver [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Deletion of /var/lib/nova/instances/7a4b7b8b-7dc3-454d-a5b3-70a319a09751_del complete#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.178 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f124c71d-5c28-403d-8a33-2dfbd26f8cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.180 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[5d62ef49-4345-49f5-bc1a-617748a0b18b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.194 187010 DEBUG nova.compute.manager [req-1fc8850b-67fc-4843-b850-1f1294df3043 req-97a87452-7916-429a-9ceb-21ec866e1c40 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received event network-vif-unplugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.194 187010 DEBUG oslo_concurrency.lockutils [req-1fc8850b-67fc-4843-b850-1f1294df3043 req-97a87452-7916-429a-9ceb-21ec866e1c40 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.195 187010 DEBUG oslo_concurrency.lockutils [req-1fc8850b-67fc-4843-b850-1f1294df3043 req-97a87452-7916-429a-9ceb-21ec866e1c40 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.195 187010 DEBUG oslo_concurrency.lockutils [req-1fc8850b-67fc-4843-b850-1f1294df3043 req-97a87452-7916-429a-9ceb-21ec866e1c40 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.195 187010 DEBUG nova.compute.manager [req-1fc8850b-67fc-4843-b850-1f1294df3043 req-97a87452-7916-429a-9ceb-21ec866e1c40 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] No waiting events found dispatching network-vif-unplugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.195 187010 DEBUG nova.compute.manager [req-1fc8850b-67fc-4843-b850-1f1294df3043 req-97a87452-7916-429a-9ceb-21ec866e1c40 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received event network-vif-unplugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.197 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[95e1beb1-8cad-4d11-928d-e692251f749a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347124, 'reachable_time': 35531, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217336, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 systemd[1]: run-netns-ovnmeta\x2d2cb7ffc0\x2d466a\x2d4561\x2d906a\x2d8a8e91fabf99.mount: Deactivated successfully.
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.202 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.202 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd1d8b4-f4d1-4e6c-a135-32a8ff0794b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.203 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9e968a26-a419-4ea6-9bbb-3dbbae785009 in datapath 2cb7ffc0-466a-4561-906a-8a8e91fabf99 unbound from our chassis#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.204 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2cb7ffc0-466a-4561-906a-8a8e91fabf99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.205 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[90e33697-088a-49fc-87fa-fbe40fe0d1e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.205 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9e968a26-a419-4ea6-9bbb-3dbbae785009 in datapath 2cb7ffc0-466a-4561-906a-8a8e91fabf99 unbound from our chassis#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.206 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2cb7ffc0-466a-4561-906a-8a8e91fabf99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:58:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:39.206 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[7bba4eff-3dfe-4baa-aa16-8df33a22768d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.234 187010 INFO nova.compute.manager [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.235 187010 DEBUG oslo.service.loopingcall [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.235 187010 DEBUG nova.compute.manager [-] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:58:39 np0005555140 nova_compute[187006]: 2025-12-11 09:58:39.236 187010 DEBUG nova.network.neutron [-] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.388 187010 DEBUG nova.network.neutron [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Updated VIF entry in instance network info cache for port 9e968a26-a419-4ea6-9bbb-3dbbae785009. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.389 187010 DEBUG nova.network.neutron [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Updating instance_info_cache with network_info: [{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.418 187010 DEBUG oslo_concurrency.lockutils [req-7afe5c07-da22-4d44-a446-ab4aa821bd19 req-958e233e-21e6-4801-8f71-a7a013fce3b4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-7a4b7b8b-7dc3-454d-a5b3-70a319a09751" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.484 187010 DEBUG nova.network.neutron [-] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.498 187010 INFO nova.compute.manager [-] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Took 1.26 seconds to deallocate network for instance.#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.552 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.552 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.603 187010 DEBUG nova.compute.provider_tree [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.618 187010 DEBUG nova.scheduler.client.report [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.636 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.665 187010 INFO nova.scheduler.client.report [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 7a4b7b8b-7dc3-454d-a5b3-70a319a09751#033[00m
Dec 11 04:58:40 np0005555140 nova_compute[187006]: 2025-12-11 09:58:40.736 187010 DEBUG oslo_concurrency.lockutils [None req-17d63571-a872-45d0-b38a-aaa0eb62c2f7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:41 np0005555140 nova_compute[187006]: 2025-12-11 09:58:41.262 187010 DEBUG nova.compute.manager [req-94b459e2-d62e-409d-8986-914491a3afa5 req-c152d938-6f12-4921-b975-2b07e85e2623 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:41 np0005555140 nova_compute[187006]: 2025-12-11 09:58:41.263 187010 DEBUG oslo_concurrency.lockutils [req-94b459e2-d62e-409d-8986-914491a3afa5 req-c152d938-6f12-4921-b975-2b07e85e2623 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:41 np0005555140 nova_compute[187006]: 2025-12-11 09:58:41.263 187010 DEBUG oslo_concurrency.lockutils [req-94b459e2-d62e-409d-8986-914491a3afa5 req-c152d938-6f12-4921-b975-2b07e85e2623 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:41 np0005555140 nova_compute[187006]: 2025-12-11 09:58:41.263 187010 DEBUG oslo_concurrency.lockutils [req-94b459e2-d62e-409d-8986-914491a3afa5 req-c152d938-6f12-4921-b975-2b07e85e2623 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "7a4b7b8b-7dc3-454d-a5b3-70a319a09751-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:41 np0005555140 nova_compute[187006]: 2025-12-11 09:58:41.264 187010 DEBUG nova.compute.manager [req-94b459e2-d62e-409d-8986-914491a3afa5 req-c152d938-6f12-4921-b975-2b07e85e2623 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] No waiting events found dispatching network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:41 np0005555140 nova_compute[187006]: 2025-12-11 09:58:41.264 187010 WARNING nova.compute.manager [req-94b459e2-d62e-409d-8986-914491a3afa5 req-c152d938-6f12-4921-b975-2b07e85e2623 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Received unexpected event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:58:41 np0005555140 nova_compute[187006]: 2025-12-11 09:58:41.815 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:43 np0005555140 podman[217337]: 2025-12-11 09:58:43.708522781 +0000 UTC m=+0.069793295 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec 11 04:58:44 np0005555140 nova_compute[187006]: 2025-12-11 09:58:44.154 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.314 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.315 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.332 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.394 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.395 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.401 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.402 187010 INFO nova.compute.claims [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.505 187010 DEBUG nova.compute.provider_tree [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.528 187010 DEBUG nova.scheduler.client.report [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.550 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.551 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.602 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.603 187010 DEBUG nova.network.neutron [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.627 187010 INFO nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.651 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.740 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.742 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.743 187010 INFO nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Creating image(s)#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.744 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.744 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.745 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.762 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.845 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.846 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.847 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.866 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.926 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.927 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.986 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.987 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:45 np0005555140 nova_compute[187006]: 2025-12-11 09:58:45.988 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.041 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.042 187010 DEBUG nova.virt.disk.api [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.043 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.111 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.112 187010 DEBUG nova.virt.disk.api [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.112 187010 DEBUG nova.objects.instance [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 144ea666-8a03-4938-9389-b9c1ba226d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.165 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.166 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Ensure instance console log exists: /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.167 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.167 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.168 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.223 187010 DEBUG nova.policy [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:58:46 np0005555140 nova_compute[187006]: 2025-12-11 09:58:46.817 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.587 187010 DEBUG nova.network.neutron [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Successfully updated port: 9e968a26-a419-4ea6-9bbb-3dbbae785009 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.614 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-144ea666-8a03-4938-9389-b9c1ba226d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.615 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-144ea666-8a03-4938-9389-b9c1ba226d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.616 187010 DEBUG nova.network.neutron [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:58:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:48.625 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:48.626 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:48.626 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:48 np0005555140 podman[217371]: 2025-12-11 09:58:48.68802203 +0000 UTC m=+0.058929604 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.690 187010 DEBUG nova.compute.manager [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received event network-changed-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.690 187010 DEBUG nova.compute.manager [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Refreshing instance network info cache due to event network-changed-9e968a26-a419-4ea6-9bbb-3dbbae785009. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.691 187010 DEBUG oslo_concurrency.lockutils [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-144ea666-8a03-4938-9389-b9c1ba226d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:58:48 np0005555140 nova_compute[187006]: 2025-12-11 09:58:48.739 187010 DEBUG nova.network.neutron [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:58:49 np0005555140 nova_compute[187006]: 2025-12-11 09:58:49.155 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:50 np0005555140 podman[217396]: 2025-12-11 09:58:50.67957555 +0000 UTC m=+0.048248919 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 11 04:58:50 np0005555140 podman[217395]: 2025-12-11 09:58:50.702117884 +0000 UTC m=+0.075512758 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 04:58:51 np0005555140 nova_compute[187006]: 2025-12-11 09:58:51.820 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:51 np0005555140 nova_compute[187006]: 2025-12-11 09:58:51.889 187010 DEBUG nova.network.neutron [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Updating instance_info_cache with network_info: [{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.067 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-144ea666-8a03-4938-9389-b9c1ba226d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.068 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Instance network_info: |[{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.068 187010 DEBUG oslo_concurrency.lockutils [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-144ea666-8a03-4938-9389-b9c1ba226d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.068 187010 DEBUG nova.network.neutron [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Refreshing network info cache for port 9e968a26-a419-4ea6-9bbb-3dbbae785009 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.072 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Start _get_guest_xml network_info=[{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.078 187010 WARNING nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.085 187010 DEBUG nova.virt.libvirt.host [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.086 187010 DEBUG nova.virt.libvirt.host [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.092 187010 DEBUG nova.virt.libvirt.host [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.093 187010 DEBUG nova.virt.libvirt.host [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.093 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.093 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.094 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.094 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.094 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.094 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.094 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.095 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.095 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.095 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.095 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.095 187010 DEBUG nova.virt.hardware [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.099 187010 DEBUG nova.virt.libvirt.vif [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:58:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1021355631',display_name='tempest-TestNetworkBasicOps-server-1021355631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1021355631',id=9,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJPR1v1GRZMm4nA1O63O4z+BOxCvX+aqxbcmuDrUQ8gPLQonPjcJrEqNNo0GdJHlbOfcvWmwNMadNbuUasOStbq3L/P2Kd1awZwlemB6vbsP/j+H/ZJtoSene4uBcFfB2A==',key_name='tempest-TestNetworkBasicOps-1260814663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-yugg95lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:58:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=144ea666-8a03-4938-9389-b9c1ba226d18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.099 187010 DEBUG nova.network.os_vif_util [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.100 187010 DEBUG nova.network.os_vif_util [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.101 187010 DEBUG nova.objects.instance [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 144ea666-8a03-4938-9389-b9c1ba226d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.113 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <uuid>144ea666-8a03-4938-9389-b9c1ba226d18</uuid>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <name>instance-00000009</name>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1021355631</nova:name>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:58:52</nova:creationTime>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        <nova:port uuid="9e968a26-a419-4ea6-9bbb-3dbbae785009">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <entry name="serial">144ea666-8a03-4938-9389-b9c1ba226d18</entry>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <entry name="uuid">144ea666-8a03-4938-9389-b9c1ba226d18</entry>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk.config"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:20:56:19"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <target dev="tap9e968a26-a4"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/console.log" append="off"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:58:52 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:58:52 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:58:52 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:58:52 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.114 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Preparing to wait for external event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.115 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.116 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.116 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.117 187010 DEBUG nova.virt.libvirt.vif [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:58:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1021355631',display_name='tempest-TestNetworkBasicOps-server-1021355631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1021355631',id=9,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJPR1v1GRZMm4nA1O63O4z+BOxCvX+aqxbcmuDrUQ8gPLQonPjcJrEqNNo0GdJHlbOfcvWmwNMadNbuUasOStbq3L/P2Kd1awZwlemB6vbsP/j+H/ZJtoSene4uBcFfB2A==',key_name='tempest-TestNetworkBasicOps-1260814663',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-yugg95lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:58:45Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=144ea666-8a03-4938-9389-b9c1ba226d18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.117 187010 DEBUG nova.network.os_vif_util [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.118 187010 DEBUG nova.network.os_vif_util [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.118 187010 DEBUG os_vif [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.119 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.119 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.120 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.122 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.123 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e968a26-a4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.123 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9e968a26-a4, col_values=(('external_ids', {'iface-id': '9e968a26-a419-4ea6-9bbb-3dbbae785009', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:56:19', 'vm-uuid': '144ea666-8a03-4938-9389-b9c1ba226d18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.125 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:52 np0005555140 NetworkManager[55531]: <info>  [1765447132.1262] manager: (tap9e968a26-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.128 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.129 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.130 187010 INFO os_vif [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4')#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.178 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.178 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.179 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:20:56:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:58:52 np0005555140 nova_compute[187006]: 2025-12-11 09:58:52.179 187010 INFO nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Using config drive#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.103 187010 INFO nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Creating config drive at /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk.config#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.112 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4mzpkvfw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.249 187010 DEBUG oslo_concurrency.processutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4mzpkvfw" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:58:53 np0005555140 kernel: tap9e968a26-a4: entered promiscuous mode
Dec 11 04:58:53 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:53Z|00133|binding|INFO|Claiming lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 for this chassis.
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.318 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:53 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:53Z|00134|binding|INFO|9e968a26-a419-4ea6-9bbb-3dbbae785009: Claiming fa:16:3e:20:56:19 10.100.0.4
Dec 11 04:58:53 np0005555140 NetworkManager[55531]: <info>  [1765447133.3205] manager: (tap9e968a26-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Dec 11 04:58:53 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:53Z|00135|binding|INFO|Setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 ovn-installed in OVS
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.330 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.334 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:53 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:53Z|00136|binding|INFO|Setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 up in Southbound
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.337 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:56:19 10.100.0.4'], port_security=['fa:16:3e:20:56:19 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '144ea666-8a03-4938-9389-b9c1ba226d18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce84cee6-877f-4408-829a-c39275b1710b, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9e968a26-a419-4ea6-9bbb-3dbbae785009) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.339 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9e968a26-a419-4ea6-9bbb-3dbbae785009 in datapath 2cb7ffc0-466a-4561-906a-8a8e91fabf99 bound to our chassis#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.341 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2cb7ffc0-466a-4561-906a-8a8e91fabf99#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.351 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[56972b1c-5bc9-4b09-ad53-385d391b402f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.352 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2cb7ffc0-41 in ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.355 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2cb7ffc0-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.355 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[3e332a3d-6aa2-4ff4-9436-2d52a186aeba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 systemd-machined[153398]: New machine qemu-9-instance-00000009.
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.357 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[8a11633e-4558-4d86-b2d7-4c4f55d98a53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.369 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[29656546-02b8-47c6-848a-cb2da0e7c0a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Dec 11 04:58:53 np0005555140 systemd-udevd[217464]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.381 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ce60a9-f987-4b68-b6a1-3cb5f5585d13]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 NetworkManager[55531]: <info>  [1765447133.3899] device (tap9e968a26-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:58:53 np0005555140 NetworkManager[55531]: <info>  [1765447133.3907] device (tap9e968a26-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.408 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c157c7-0cc3-42e4-b6e3-bcc81f4263bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 systemd-udevd[217467]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:58:53 np0005555140 NetworkManager[55531]: <info>  [1765447133.4142] manager: (tap2cb7ffc0-40): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.414 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[de74499c-2636-46c7-91b4-b49dce09d492]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.444 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[4a041846-9709-40c6-9063-ef4024304f32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.448 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[6f22f7a8-45bd-4ca4-92a5-eb6ecbba7044]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 NetworkManager[55531]: <info>  [1765447133.4720] device (tap2cb7ffc0-40): carrier: link connected
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.476 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[73d27e83-93f1-43dd-bb7d-acb03470364c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.493 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f88b3455-8679-4e2b-a09d-84aa4bf82070]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb7ffc0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:50:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349204, 'reachable_time': 35625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217494, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.507 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[d8803eb0-3b2d-4690-9766-13829dff81fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec1:50b1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349204, 'tstamp': 349204}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217495, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.526 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ca721d-3845-4d58-aa41-480f12094c1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2cb7ffc0-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c1:50:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349204, 'reachable_time': 35625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217496, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.554 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[16e68956-fcb6-4038-a2e5-eca4260440fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.611 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a25091f0-8bf0-4499-93d8-7519df08cf26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.614 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb7ffc0-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.614 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.615 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2cb7ffc0-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:53 np0005555140 NetworkManager[55531]: <info>  [1765447133.6173] manager: (tap2cb7ffc0-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Dec 11 04:58:53 np0005555140 kernel: tap2cb7ffc0-40: entered promiscuous mode
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.618 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.619 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2cb7ffc0-40, col_values=(('external_ids', {'iface-id': '4ff438e0-383e-4ff9-ada6-ed08f969d9d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:53 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:53Z|00137|binding|INFO|Releasing lport 4ff438e0-383e-4ff9-ada6-ed08f969d9d1 from this chassis (sb_readonly=0)
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.620 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.621 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.622 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2cb7ffc0-466a-4561-906a-8a8e91fabf99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2cb7ffc0-466a-4561-906a-8a8e91fabf99.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.622 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[36bd5889-6802-4baa-90e1-5767d24ab655]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.623 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-2cb7ffc0-466a-4561-906a-8a8e91fabf99
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/2cb7ffc0-466a-4561-906a-8a8e91fabf99.pid.haproxy
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 2cb7ffc0-466a-4561-906a-8a8e91fabf99
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 04:58:53 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:53.624 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'env', 'PROCESS_TAG=haproxy-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2cb7ffc0-466a-4561-906a-8a8e91fabf99.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.636 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.674 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447133.673114, 144ea666-8a03-4938-9389-b9c1ba226d18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.674 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] VM Started (Lifecycle Event)#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.704 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.709 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447133.6734474, 144ea666-8a03-4938-9389-b9c1ba226d18 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.709 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.736 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.739 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.758 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.889 187010 DEBUG nova.compute.manager [req-1e22e83a-08c1-4cd6-a69b-83564beaa845 req-48c68462-a2df-4806-8409-45b2255fece0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.890 187010 DEBUG oslo_concurrency.lockutils [req-1e22e83a-08c1-4cd6-a69b-83564beaa845 req-48c68462-a2df-4806-8409-45b2255fece0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.890 187010 DEBUG oslo_concurrency.lockutils [req-1e22e83a-08c1-4cd6-a69b-83564beaa845 req-48c68462-a2df-4806-8409-45b2255fece0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.890 187010 DEBUG oslo_concurrency.lockutils [req-1e22e83a-08c1-4cd6-a69b-83564beaa845 req-48c68462-a2df-4806-8409-45b2255fece0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.890 187010 DEBUG nova.compute.manager [req-1e22e83a-08c1-4cd6-a69b-83564beaa845 req-48c68462-a2df-4806-8409-45b2255fece0 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Processing event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.892 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.895 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447133.894737, 144ea666-8a03-4938-9389-b9c1ba226d18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.895 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.897 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.901 187010 INFO nova.virt.libvirt.driver [-] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Instance spawned successfully.#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.901 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.930 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.936 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.940 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.941 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.941 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.941 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.942 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.942 187010 DEBUG nova.virt.libvirt.driver [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:58:53 np0005555140 nova_compute[187006]: 2025-12-11 09:58:53.971 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.010 187010 INFO nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Took 8.27 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.011 187010 DEBUG nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.090 187010 INFO nova.compute.manager [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Took 8.72 seconds to build instance.#033[00m
Dec 11 04:58:54 np0005555140 podman[217534]: 2025-12-11 09:58:53.994639141 +0000 UTC m=+0.038150481 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.119 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447119.1179829, 7a4b7b8b-7dc3-454d-a5b3-70a319a09751 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.120 187010 INFO nova.compute.manager [-] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:58:54 np0005555140 podman[217534]: 2025-12-11 09:58:54.144829631 +0000 UTC m=+0.188340951 container create 9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.154 187010 DEBUG oslo_concurrency.lockutils [None req-da3c211a-5287-4471-8808-9a9b0a2c620d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.158 187010 DEBUG nova.compute.manager [None req-aabe3c13-81f9-4ac3-85ee-3bc3de1c95da - - - - - -] [instance: 7a4b7b8b-7dc3-454d-a5b3-70a319a09751] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:58:54 np0005555140 systemd[1]: Started libpod-conmon-9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15.scope.
Dec 11 04:58:54 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:58:54 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/186caed72a072f05a578eea253b8a331861f28a4e23e62457fef723bed76b06f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:58:54 np0005555140 podman[217534]: 2025-12-11 09:58:54.248484271 +0000 UTC m=+0.291995611 container init 9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Dec 11 04:58:54 np0005555140 podman[217534]: 2025-12-11 09:58:54.255698547 +0000 UTC m=+0.299209857 container start 9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.263 187010 DEBUG nova.network.neutron [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Updated VIF entry in instance network info cache for port 9e968a26-a419-4ea6-9bbb-3dbbae785009. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.263 187010 DEBUG nova.network.neutron [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Updating instance_info_cache with network_info: [{"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:54 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217550]: [NOTICE]   (217554) : New worker (217556) forked
Dec 11 04:58:54 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217550]: [NOTICE]   (217554) : Loading success.
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.325 187010 DEBUG oslo_concurrency.lockutils [req-fc33fe63-76d9-4466-b3d6-5c2097445748 req-5343e53a-a1f6-46f0-8bf1-e0565bed957d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-144ea666-8a03-4938-9389-b9c1ba226d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:58:54 np0005555140 nova_compute[187006]: 2025-12-11 09:58:54.853 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.568 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.568 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.569 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.569 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.569 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.570 187010 INFO nova.compute.manager [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Terminating instance#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.571 187010 DEBUG nova.compute.manager [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 04:58:55 np0005555140 kernel: tap9e968a26-a4 (unregistering): left promiscuous mode
Dec 11 04:58:55 np0005555140 NetworkManager[55531]: <info>  [1765447135.5992] device (tap9e968a26-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.610 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:55Z|00138|binding|INFO|Releasing lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 from this chassis (sb_readonly=0)
Dec 11 04:58:55 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:55Z|00139|binding|INFO|Setting lport 9e968a26-a419-4ea6-9bbb-3dbbae785009 down in Southbound
Dec 11 04:58:55 np0005555140 ovn_controller[95438]: 2025-12-11T09:58:55Z|00140|binding|INFO|Removing iface tap9e968a26-a4 ovn-installed in OVS
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.612 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.619 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:56:19 10.100.0.4'], port_security=['fa:16:3e:20:56:19 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '144ea666-8a03-4938-9389-b9c1ba226d18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1532882399', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'ba7b1513-0727-42bc-a2ab-eb3e0da7cbf2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.215', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce84cee6-877f-4408-829a-c39275b1710b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=9e968a26-a419-4ea6-9bbb-3dbbae785009) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.622 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 9e968a26-a419-4ea6-9bbb-3dbbae785009 in datapath 2cb7ffc0-466a-4561-906a-8a8e91fabf99 unbound from our chassis#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.624 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2cb7ffc0-466a-4561-906a-8a8e91fabf99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.625 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ffba1dc0-3368-4acc-ba38-b656a099f8df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.626 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 namespace which is not needed anymore#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.631 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec 11 04:58:55 np0005555140 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 1.998s CPU time.
Dec 11 04:58:55 np0005555140 systemd-machined[153398]: Machine qemu-9-instance-00000009 terminated.
Dec 11 04:58:55 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217550]: [NOTICE]   (217554) : haproxy version is 2.8.14-c23fe91
Dec 11 04:58:55 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217550]: [NOTICE]   (217554) : path to executable is /usr/sbin/haproxy
Dec 11 04:58:55 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217550]: [WARNING]  (217554) : Exiting Master process...
Dec 11 04:58:55 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217550]: [ALERT]    (217554) : Current worker (217556) exited with code 143 (Terminated)
Dec 11 04:58:55 np0005555140 neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99[217550]: [WARNING]  (217554) : All workers exited. Exiting... (0)
Dec 11 04:58:55 np0005555140 systemd[1]: libpod-9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15.scope: Deactivated successfully.
Dec 11 04:58:55 np0005555140 podman[217587]: 2025-12-11 09:58:55.757480769 +0000 UTC m=+0.051387169 container died 9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 04:58:55 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15-userdata-shm.mount: Deactivated successfully.
Dec 11 04:58:55 np0005555140 NetworkManager[55531]: <info>  [1765447135.7954] manager: (tap9e968a26-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Dec 11 04:58:55 np0005555140 systemd[1]: var-lib-containers-storage-overlay-186caed72a072f05a578eea253b8a331861f28a4e23e62457fef723bed76b06f-merged.mount: Deactivated successfully.
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.799 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.806 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 podman[217587]: 2025-12-11 09:58:55.810279527 +0000 UTC m=+0.104185927 container cleanup 9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 04:58:55 np0005555140 systemd[1]: libpod-conmon-9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15.scope: Deactivated successfully.
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.853 187010 INFO nova.virt.libvirt.driver [-] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Instance destroyed successfully.#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.853 187010 DEBUG nova.objects.instance [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 144ea666-8a03-4938-9389-b9c1ba226d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.882 187010 DEBUG nova.virt.libvirt.vif [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:58:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1021355631',display_name='tempest-TestNetworkBasicOps-server-1021355631',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1021355631',id=9,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJPR1v1GRZMm4nA1O63O4z+BOxCvX+aqxbcmuDrUQ8gPLQonPjcJrEqNNo0GdJHlbOfcvWmwNMadNbuUasOStbq3L/P2Kd1awZwlemB6vbsP/j+H/ZJtoSene4uBcFfB2A==',key_name='tempest-TestNetworkBasicOps-1260814663',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:58:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-yugg95lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:58:54Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=144ea666-8a03-4938-9389-b9c1ba226d18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.883 187010 DEBUG nova.network.os_vif_util [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "address": "fa:16:3e:20:56:19", "network": {"id": "2cb7ffc0-466a-4561-906a-8a8e91fabf99", "bridge": "br-int", "label": "tempest-network-smoke--582870543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e968a26-a4", "ovs_interfaceid": "9e968a26-a419-4ea6-9bbb-3dbbae785009", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.883 187010 DEBUG nova.network.os_vif_util [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.884 187010 DEBUG os_vif [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.885 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.886 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e968a26-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.889 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.893 187010 INFO os_vif [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:56:19,bridge_name='br-int',has_traffic_filtering=True,id=9e968a26-a419-4ea6-9bbb-3dbbae785009,network=Network(2cb7ffc0-466a-4561-906a-8a8e91fabf99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9e968a26-a4')#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.894 187010 INFO nova.virt.libvirt.driver [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Deleting instance files /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18_del#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.895 187010 INFO nova.virt.libvirt.driver [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Deletion of /var/lib/nova/instances/144ea666-8a03-4938-9389-b9c1ba226d18_del complete#033[00m
Dec 11 04:58:55 np0005555140 podman[217628]: 2025-12-11 09:58:55.900971727 +0000 UTC m=+0.062887417 container remove 9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.905 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[cf153df8-7d18-4bb5-8ad7-1adc303084ac]: (4, ('Thu Dec 11 09:58:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 (9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15)\n9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15\nThu Dec 11 09:58:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 (9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15)\n9cd06c0a108da26e5f0a8884aa380d14097efd9a54e8a051f6c85392819c9f15\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.907 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[768595a0-811f-4c27-abb1-58e35a9b3939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.908 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2cb7ffc0-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.910 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 kernel: tap2cb7ffc0-40: left promiscuous mode
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.914 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1a47a8-d634-443c-950c-348b9066900e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.926 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.936 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6656330b-ba46-446b-b744-8319b8bc0b18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.938 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[64120bd8-706b-4df0-81f0-14561475c5a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.944 187010 INFO nova.compute.manager [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.945 187010 DEBUG oslo.service.loopingcall [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.945 187010 DEBUG nova.compute.manager [-] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.946 187010 DEBUG nova.network.neutron [-] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.957 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[0dbd754b-5e35-4919-b68b-cf3e6fb49427]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349197, 'reachable_time': 42248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217647, 'error': None, 'target': 'ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.960 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2cb7ffc0-466a-4561-906a-8a8e91fabf99 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 04:58:55 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:55.961 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7e69ba-77ea-4ba0-9c68-fa6992f6eb57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:58:55 np0005555140 systemd[1]: run-netns-ovnmeta\x2d2cb7ffc0\x2d466a\x2d4561\x2d906a\x2d8a8e91fabf99.mount: Deactivated successfully.
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.967 187010 DEBUG nova.compute.manager [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.967 187010 DEBUG oslo_concurrency.lockutils [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.967 187010 DEBUG oslo_concurrency.lockutils [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.967 187010 DEBUG oslo_concurrency.lockutils [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.968 187010 DEBUG nova.compute.manager [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] No waiting events found dispatching network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.968 187010 WARNING nova.compute.manager [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received unexpected event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 for instance with vm_state active and task_state deleting.#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.968 187010 DEBUG nova.compute.manager [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received event network-vif-unplugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.968 187010 DEBUG oslo_concurrency.lockutils [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.968 187010 DEBUG oslo_concurrency.lockutils [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.968 187010 DEBUG oslo_concurrency.lockutils [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.969 187010 DEBUG nova.compute.manager [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] No waiting events found dispatching network-vif-unplugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:55 np0005555140 nova_compute[187006]: 2025-12-11 09:58:55.969 187010 DEBUG nova.compute.manager [req-74be3822-26bc-438b-91af-878206ebb558 req-06b12425-f283-41cf-975e-0f7e01ef03ba b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received event network-vif-unplugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 04:58:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:56.775 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.776 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:56.777 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:58:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:58:56.778 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.823 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.897 187010 DEBUG nova.network.neutron [-] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.923 187010 INFO nova.compute.manager [-] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Took 0.98 seconds to deallocate network for instance.#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.978 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:56 np0005555140 nova_compute[187006]: 2025-12-11 09:58:56.979 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.045 187010 DEBUG nova.compute.provider_tree [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.066 187010 DEBUG nova.scheduler.client.report [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.099 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.125 187010 INFO nova.scheduler.client.report [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 144ea666-8a03-4938-9389-b9c1ba226d18#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.202 187010 DEBUG oslo_concurrency.lockutils [None req-5ae4c7fa-a594-4172-b37a-e5b40ed06647 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:57 np0005555140 nova_compute[187006]: 2025-12-11 09:58:57.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.034 187010 DEBUG nova.compute.manager [req-9ecb433b-1678-48f8-a4c5-3e1a6ca742fe req-bd7a60fe-4bd9-4057-be3a-930094fb622a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.035 187010 DEBUG oslo_concurrency.lockutils [req-9ecb433b-1678-48f8-a4c5-3e1a6ca742fe req-bd7a60fe-4bd9-4057-be3a-930094fb622a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.035 187010 DEBUG oslo_concurrency.lockutils [req-9ecb433b-1678-48f8-a4c5-3e1a6ca742fe req-bd7a60fe-4bd9-4057-be3a-930094fb622a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.035 187010 DEBUG oslo_concurrency.lockutils [req-9ecb433b-1678-48f8-a4c5-3e1a6ca742fe req-bd7a60fe-4bd9-4057-be3a-930094fb622a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "144ea666-8a03-4938-9389-b9c1ba226d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.035 187010 DEBUG nova.compute.manager [req-9ecb433b-1678-48f8-a4c5-3e1a6ca742fe req-bd7a60fe-4bd9-4057-be3a-930094fb622a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] No waiting events found dispatching network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.035 187010 WARNING nova.compute.manager [req-9ecb433b-1678-48f8-a4c5-3e1a6ca742fe req-bd7a60fe-4bd9-4057-be3a-930094fb622a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Received unexpected event network-vif-plugged-9e968a26-a419-4ea6-9bbb-3dbbae785009 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.846 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.847 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.867 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.868 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.868 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:58:58 np0005555140 nova_compute[187006]: 2025-12-11 09:58:58.868 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.049 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.050 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.32846450805664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.051 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.051 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.098 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.099 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.115 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.134 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.154 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 04:58:59 np0005555140 nova_compute[187006]: 2025-12-11 09:58:59.155 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:00 np0005555140 podman[217649]: 2025-12-11 09:59:00.6820442 +0000 UTC m=+0.054792566 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 04:59:00 np0005555140 nova_compute[187006]: 2025-12-11 09:59:00.890 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:01 np0005555140 nova_compute[187006]: 2025-12-11 09:59:01.824 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:04 np0005555140 nova_compute[187006]: 2025-12-11 09:59:04.136 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:04 np0005555140 nova_compute[187006]: 2025-12-11 09:59:04.166 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:04 np0005555140 nova_compute[187006]: 2025-12-11 09:59:04.240 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:05 np0005555140 nova_compute[187006]: 2025-12-11 09:59:05.893 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:06 np0005555140 nova_compute[187006]: 2025-12-11 09:59:06.827 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:09 np0005555140 podman[217676]: 2025-12-11 09:59:09.700790085 +0000 UTC m=+0.066035837 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 11 04:59:09 np0005555140 podman[217675]: 2025-12-11 09:59:09.744951197 +0000 UTC m=+0.108616674 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 11 04:59:10 np0005555140 nova_compute[187006]: 2025-12-11 09:59:10.850 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447135.8497446, 144ea666-8a03-4938-9389-b9c1ba226d18 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:59:10 np0005555140 nova_compute[187006]: 2025-12-11 09:59:10.851 187010 INFO nova.compute.manager [-] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] VM Stopped (Lifecycle Event)#033[00m
Dec 11 04:59:10 np0005555140 nova_compute[187006]: 2025-12-11 09:59:10.897 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:11 np0005555140 nova_compute[187006]: 2025-12-11 09:59:11.828 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:13 np0005555140 nova_compute[187006]: 2025-12-11 09:59:13.989 187010 DEBUG nova.compute.manager [None req-da02e5f2-ae6b-44ea-ad3c-4bf76af76261 - - - - - -] [instance: 144ea666-8a03-4938-9389-b9c1ba226d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:59:14 np0005555140 podman[217718]: 2025-12-11 09:59:14.672682727 +0000 UTC m=+0.050883664 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 04:59:15 np0005555140 nova_compute[187006]: 2025-12-11 09:59:15.899 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:16 np0005555140 nova_compute[187006]: 2025-12-11 09:59:16.831 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:19 np0005555140 podman[217738]: 2025-12-11 09:59:19.66963509 +0000 UTC m=+0.048599403 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 04:59:20 np0005555140 nova_compute[187006]: 2025-12-11 09:59:20.901 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:21 np0005555140 podman[217762]: 2025-12-11 09:59:21.719724858 +0000 UTC m=+0.089150825 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller)
Dec 11 04:59:21 np0005555140 podman[217763]: 2025-12-11 09:59:21.723446764 +0000 UTC m=+0.077025477 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 11 04:59:21 np0005555140 nova_compute[187006]: 2025-12-11 09:59:21.831 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:25 np0005555140 nova_compute[187006]: 2025-12-11 09:59:25.904 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:26 np0005555140 nova_compute[187006]: 2025-12-11 09:59:26.833 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.459 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.460 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.488 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.648 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.649 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.657 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.657 187010 INFO nova.compute.claims [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.817 187010 DEBUG nova.compute.provider_tree [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.833 187010 DEBUG nova.scheduler.client.report [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.851 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.852 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.894 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.894 187010 DEBUG nova.network.neutron [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.907 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.919 187010 INFO nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 04:59:30 np0005555140 nova_compute[187006]: 2025-12-11 09:59:30.938 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.051 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.052 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.052 187010 INFO nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Creating image(s)#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.053 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.053 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.054 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.066 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.132 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.133 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.134 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.144 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.195 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.196 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:59:31 np0005555140 podman[217817]: 2025-12-11 09:59:31.690027215 +0000 UTC m=+0.054566994 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.782 187010 DEBUG nova.policy [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.809 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk 1073741824" returned: 0 in 0.612s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.809 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.810 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.836 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.895 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.896 187010 DEBUG nova.virt.disk.api [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.897 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.967 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.968 187010 DEBUG nova.virt.disk.api [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.969 187010 DEBUG nova.objects.instance [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid e861be47-625e-4830-ba0d-e45eced42fe6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.982 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.983 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Ensure instance console log exists: /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.984 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.984 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:31 np0005555140 nova_compute[187006]: 2025-12-11 09:59:31.984 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:34 np0005555140 nova_compute[187006]: 2025-12-11 09:59:34.781 187010 DEBUG nova.network.neutron [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Successfully created port: 2eb0e5af-4006-431a-890b-a8cc811acafe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 04:59:35 np0005555140 nova_compute[187006]: 2025-12-11 09:59:35.911 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.001 187010 DEBUG nova.network.neutron [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Successfully updated port: 2eb0e5af-4006-431a-890b-a8cc811acafe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.021 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.022 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.022 187010 DEBUG nova.network.neutron [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.104 187010 DEBUG nova.compute.manager [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-changed-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.105 187010 DEBUG nova.compute.manager [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Refreshing instance network info cache due to event network-changed-2eb0e5af-4006-431a-890b-a8cc811acafe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.105 187010 DEBUG oslo_concurrency.lockutils [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.635 187010 DEBUG nova.network.neutron [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 04:59:36 np0005555140 nova_compute[187006]: 2025-12-11 09:59:36.838 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.809 187010 DEBUG nova.network.neutron [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updating instance_info_cache with network_info: [{"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.831 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.832 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Instance network_info: |[{"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.832 187010 DEBUG oslo_concurrency.lockutils [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.832 187010 DEBUG nova.network.neutron [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Refreshing network info cache for port 2eb0e5af-4006-431a-890b-a8cc811acafe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.835 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Start _get_guest_xml network_info=[{"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.840 187010 WARNING nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.845 187010 DEBUG nova.virt.libvirt.host [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.845 187010 DEBUG nova.virt.libvirt.host [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.849 187010 DEBUG nova.virt.libvirt.host [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.849 187010 DEBUG nova.virt.libvirt.host [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.850 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.850 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.850 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.851 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.851 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.851 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.851 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.852 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.852 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.852 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.852 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.853 187010 DEBUG nova.virt.hardware [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.857 187010 DEBUG nova.virt.libvirt.vif [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2095390827',display_name='tempest-TestNetworkBasicOps-server-2095390827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2095390827',id=10,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBJTRNPYFwJw3SQZr4ROevEM5duu4m1YhuxEhBAqhG2Y11PAdG6TG7FM7xOCWCJLl0z/aX/t188s0dkX9sSWjb0QAj8w0vF2wJi3qc/ps4dx4WKZO0Q4XWGyhVjy86v35g==',key_name='tempest-TestNetworkBasicOps-294158070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-elyzwnhg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:59:30Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=e861be47-625e-4830-ba0d-e45eced42fe6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.857 187010 DEBUG nova.network.os_vif_util [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.858 187010 DEBUG nova.network.os_vif_util [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:31:62,bridge_name='br-int',has_traffic_filtering=True,id=2eb0e5af-4006-431a-890b-a8cc811acafe,network=Network(56f13783-e1c0-4b79-861d-1f0a15faf4d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eb0e5af-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.859 187010 DEBUG nova.objects.instance [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid e861be47-625e-4830-ba0d-e45eced42fe6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.874 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] End _get_guest_xml xml=<domain type="kvm">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <uuid>e861be47-625e-4830-ba0d-e45eced42fe6</uuid>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <name>instance-0000000a</name>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-2095390827</nova:name>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 09:59:37</nova:creationTime>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        <nova:port uuid="2eb0e5af-4006-431a-890b-a8cc811acafe">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <system>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <entry name="serial">e861be47-625e-4830-ba0d-e45eced42fe6</entry>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <entry name="uuid">e861be47-625e-4830-ba0d-e45eced42fe6</entry>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </system>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <os>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  </os>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <features>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  </features>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  </clock>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  <devices>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk.config"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </disk>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:5b:31:62"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <target dev="tap2eb0e5af-40"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </interface>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/console.log" append="off"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </serial>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <video>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </video>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </rng>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 04:59:37 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 04:59:37 np0005555140 nova_compute[187006]:  </devices>
Dec 11 04:59:37 np0005555140 nova_compute[187006]: </domain>
Dec 11 04:59:37 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.875 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Preparing to wait for external event network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.876 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.876 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.876 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.877 187010 DEBUG nova.virt.libvirt.vif [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T09:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2095390827',display_name='tempest-TestNetworkBasicOps-server-2095390827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2095390827',id=10,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBJTRNPYFwJw3SQZr4ROevEM5duu4m1YhuxEhBAqhG2Y11PAdG6TG7FM7xOCWCJLl0z/aX/t188s0dkX9sSWjb0QAj8w0vF2wJi3qc/ps4dx4WKZO0Q4XWGyhVjy86v35g==',key_name='tempest-TestNetworkBasicOps-294158070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-elyzwnhg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T09:59:30Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=e861be47-625e-4830-ba0d-e45eced42fe6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.877 187010 DEBUG nova.network.os_vif_util [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.878 187010 DEBUG nova.network.os_vif_util [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:31:62,bridge_name='br-int',has_traffic_filtering=True,id=2eb0e5af-4006-431a-890b-a8cc811acafe,network=Network(56f13783-e1c0-4b79-861d-1f0a15faf4d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eb0e5af-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.879 187010 DEBUG os_vif [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:31:62,bridge_name='br-int',has_traffic_filtering=True,id=2eb0e5af-4006-431a-890b-a8cc811acafe,network=Network(56f13783-e1c0-4b79-861d-1f0a15faf4d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eb0e5af-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.879 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.880 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.880 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.883 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.883 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2eb0e5af-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.884 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2eb0e5af-40, col_values=(('external_ids', {'iface-id': '2eb0e5af-4006-431a-890b-a8cc811acafe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:31:62', 'vm-uuid': 'e861be47-625e-4830-ba0d-e45eced42fe6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.886 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.888 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 04:59:37 np0005555140 NetworkManager[55531]: <info>  [1765447177.8879] manager: (tap2eb0e5af-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.894 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.895 187010 INFO os_vif [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:31:62,bridge_name='br-int',has_traffic_filtering=True,id=2eb0e5af-4006-431a-890b-a8cc811acafe,network=Network(56f13783-e1c0-4b79-861d-1f0a15faf4d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eb0e5af-40')#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.941 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.942 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.942 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:5b:31:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 04:59:37 np0005555140 nova_compute[187006]: 2025-12-11 09:59:37.943 187010 INFO nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Using config drive#033[00m
Dec 11 04:59:38 np0005555140 nova_compute[187006]: 2025-12-11 09:59:38.785 187010 INFO nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Creating config drive at /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk.config#033[00m
Dec 11 04:59:38 np0005555140 nova_compute[187006]: 2025-12-11 09:59:38.791 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdy_hv2ok execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 04:59:38 np0005555140 nova_compute[187006]: 2025-12-11 09:59:38.917 187010 DEBUG oslo_concurrency.processutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdy_hv2ok" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
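The two oslo_concurrency lines above record Nova shelling out to mkisofs to build the instance's config-drive ISO. A rough Python sketch of assembling an equivalent argv list — the `build_mkisofs_cmd` helper and the example paths are illustrative, not Nova's actual API (the real code lives around `nova.virt.configdrive` and executes via `oslo_concurrency.processutils`):

```python
# Sketch: assemble a mkisofs invocation like the one logged above.
# Helper name and paths are illustrative, not Nova's real interface.

def build_mkisofs_cmd(output_path, staging_dir, publisher):
    """Return the argv list for creating a config-drive ISO labelled config-2."""
    return [
        "/usr/bin/mkisofs",
        "-o", output_path,          # resulting ISO image
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-publisher", publisher,    # recorded in the ISO metadata
        "-quiet",
        "-J", "-r",                 # Joliet + Rock Ridge extensions
        "-V", "config-2",           # volume label cloud-init looks for
        staging_dir,                # temp dir holding the metadata files
    ]

cmd = build_mkisofs_cmd(
    "/var/lib/nova/instances/example/disk.config",
    "/tmp/tmpexample",
    "OpenStack Compute",
)
print(" ".join(cmd))
```

The `-V config-2` label is what the guest's cloud-init uses to locate the config drive at boot.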
Dec 11 04:59:38 np0005555140 kernel: tap2eb0e5af-40: entered promiscuous mode
Dec 11 04:59:38 np0005555140 NetworkManager[55531]: <info>  [1765447178.9796] manager: (tap2eb0e5af-40): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Dec 11 04:59:38 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:38Z|00141|binding|INFO|Claiming lport 2eb0e5af-4006-431a-890b-a8cc811acafe for this chassis.
Dec 11 04:59:38 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:38Z|00142|binding|INFO|2eb0e5af-4006-431a-890b-a8cc811acafe: Claiming fa:16:3e:5b:31:62 10.100.0.14
Dec 11 04:59:38 np0005555140 nova_compute[187006]: 2025-12-11 09:59:38.981 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:38 np0005555140 nova_compute[187006]: 2025-12-11 09:59:38.984 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:38.997 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:31:62 10.100.0.14'], port_security=['fa:16:3e:5b:31:62 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a24d1a8b-da5b-4cf2-9cb3-2ca5e82221bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=720c4386-3ed1-4071-970a-4630c6cc256f, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=2eb0e5af-4006-431a-890b-a8cc811acafe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:38.999 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 2eb0e5af-4006-431a-890b-a8cc811acafe in datapath 56f13783-e1c0-4b79-861d-1f0a15faf4d1 bound to our chassis#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.000 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 56f13783-e1c0-4b79-861d-1f0a15faf4d1#033[00m
Dec 11 04:59:39 np0005555140 systemd-udevd[217867]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.017 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[1089c431-506f-4787-b44f-0ff9aefd5705]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.018 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap56f13783-e1 in ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.020 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap56f13783-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.020 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf8028b-b229-49af-8279-ef5d656e2f30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.021 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f561c61d-443e-4f13-8dad-5877b9cee5bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 systemd-machined[153398]: New machine qemu-10-instance-0000000a.
Dec 11 04:59:39 np0005555140 NetworkManager[55531]: <info>  [1765447179.0342] device (tap2eb0e5af-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 04:59:39 np0005555140 NetworkManager[55531]: <info>  [1765447179.0351] device (tap2eb0e5af-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.034 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[e858b3d6-d7b4-43ca-8bc5-f05686eed54e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.045 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Dec 11 04:59:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:39Z|00143|binding|INFO|Setting lport 2eb0e5af-4006-431a-890b-a8cc811acafe ovn-installed in OVS
Dec 11 04:59:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:39Z|00144|binding|INFO|Setting lport 2eb0e5af-4006-431a-890b-a8cc811acafe up in Southbound
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.053 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.058 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[0f28ea9f-bdb2-41ff-a004-67cefde000d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.086 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[db4a54d9-d92f-4fa1-9c12-5128c6ddc618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.091 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e0986004-b275-42c5-9a79-0715dd25be5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 NetworkManager[55531]: <info>  [1765447179.0918] manager: (tap56f13783-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.119 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[4727ee43-b042-447b-8be2-93f5ab54d135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.121 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[80faa60c-846b-49c0-92de-a6a504f0197f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 NetworkManager[55531]: <info>  [1765447179.1417] device (tap56f13783-e0): carrier: link connected
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.147 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[99cf568d-b291-477e-92ff-ea10f703c367]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.165 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[d12ce000-513e-4bf1-a17c-2a76710bf5ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap56f13783-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:af:01'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353771, 'reachable_time': 38163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217900, 'error': None, 'target': 'ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.179 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[20674517-8e70-44cc-9c35-860d3c095790]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:af01'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353771, 'tstamp': 353771}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217901, 'error': None, 'target': 'ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.197 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f9882757-e91f-42aa-9c3d-b23723f690b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap56f13783-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:af:01'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353771, 'reachable_time': 38163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217902, 'error': None, 'target': 'ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.226 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[54b9caef-33ca-411e-9bb1-56dedd06c9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.273 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[682eee56-efe6-432b-8c0c-df0e55c2ff32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.274 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56f13783-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.274 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.275 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56f13783-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:59:39 np0005555140 kernel: tap56f13783-e0: entered promiscuous mode
Dec 11 04:59:39 np0005555140 NetworkManager[55531]: <info>  [1765447179.2781] manager: (tap56f13783-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.276 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.278 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.279 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap56f13783-e0, col_values=(('external_ids', {'iface-id': 'd463ff8b-56a0-4fbf-bc0e-85a2feb7bc86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 04:59:39 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:39Z|00145|binding|INFO|Releasing lport d463ff8b-56a0-4fbf-bc0e-85a2feb7bc86 from this chassis (sb_readonly=0)
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.280 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.281 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.282 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/56f13783-e1c0-4b79-861d-1f0a15faf4d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/56f13783-e1c0-4b79-861d-1f0a15faf4d1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
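The "Unable to access ...pid.haproxy" debug line above is the agent's normal first-provisioning path: it probes for an existing haproxy pid file and, finding none, concludes no metadata proxy is running for this network yet and goes on to spawn one. A minimal sketch of that probe (the function name echoes the logged `get_value_from_file`, but this is a simplified stand-in, not Neutron's exact implementation):

```python
def get_value_from_file(path, converter=int):
    """Return the converted file contents, or None if the file is absent.

    Mirrors the behaviour logged above: a missing pid file is not an
    error, it simply means no metadata proxy has been started yet.
    """
    try:
        with open(path) as f:
            return converter(f.read().strip())
    except FileNotFoundError:
        return None

# No proxy has been started for this network, so the pid file is missing.
pid = get_value_from_file("/tmp/definitely-missing-example.pid.haproxy")
print(pid)  # None
```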
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.283 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a941d69f-78ec-4789-8da6-0bead40030c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.283 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-56f13783-e1c0-4b79-861d-1f0a15faf4d1
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/56f13783-e1c0-4b79-861d-1f0a15faf4d1.pid.haproxy
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 04:59:39 np0005555140 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 56f13783-e1c0-4b79-861d-1f0a15faf4d1
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
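The config dump above shows the per-network metadata proxy the agent is about to launch: inside the `ovnmeta-` namespace, haproxy binds the link-local metadata address 169.254.169.254:80, tags each request with `X-OVN-Network-ID`, and forwards it to the agent over the `/var/lib/neutron/metadata_proxy` UNIX socket. A small sketch that pulls the bind endpoint out of such a config (the parsing here is illustrative, not how Neutron consumes it):

```python
import re

# Trimmed copy of the 'listen' section from the logged haproxy config.
HAPROXY_CFG = """
listen listener
   bind 169.254.169.254:80
   server metadata /var/lib/neutron/metadata_proxy
   http-request add-header X-OVN-Network-ID 56f13783-e1c0-4b79-861d-1f0a15faf4d1
"""

def parse_bind(cfg):
    """Return (address, port) from the first 'bind' directive, or None."""
    m = re.search(r"^\s*bind\s+(\S+):(\d+)", cfg, re.MULTILINE)
    return (m.group(1), int(m.group(2))) if m else None

addr, port = parse_bind(HAPROXY_CFG)
print(addr, port)  # 169.254.169.254 80
```

169.254.169.254:80 is the well-known metadata endpoint guests query, which is why each network namespace gets its own proxy bound to the same address.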
Dec 11 04:59:39 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:39.284 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'env', 'PROCESS_TAG=haproxy-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/56f13783-e1c0-4b79-861d-1f0a15faf4d1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.298 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.409 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447179.408489, e861be47-625e-4830-ba0d-e45eced42fe6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.409 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] VM Started (Lifecycle Event)#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.431 187010 DEBUG nova.compute.manager [req-4fca0757-b72a-4a98-b8b6-595fa223960e req-0fd75717-d36f-4a4b-9bfe-28511d088db5 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.432 187010 DEBUG oslo_concurrency.lockutils [req-4fca0757-b72a-4a98-b8b6-595fa223960e req-0fd75717-d36f-4a4b-9bfe-28511d088db5 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.432 187010 DEBUG oslo_concurrency.lockutils [req-4fca0757-b72a-4a98-b8b6-595fa223960e req-0fd75717-d36f-4a4b-9bfe-28511d088db5 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.433 187010 DEBUG oslo_concurrency.lockutils [req-4fca0757-b72a-4a98-b8b6-595fa223960e req-0fd75717-d36f-4a4b-9bfe-28511d088db5 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.433 187010 DEBUG nova.compute.manager [req-4fca0757-b72a-4a98-b8b6-595fa223960e req-0fd75717-d36f-4a4b-9bfe-28511d088db5 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Processing event network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.434 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.450 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.456 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.459 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.463 187010 INFO nova.virt.libvirt.driver [-] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Instance spawned successfully.#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.463 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.478 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.479 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447179.4086413, e861be47-625e-4830-ba0d-e45eced42fe6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.479 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] VM Paused (Lifecycle Event)#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.487 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.487 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.488 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.488 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.489 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.490 187010 DEBUG nova.virt.libvirt.driver [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.495 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.499 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447179.4424043, e861be47-625e-4830-ba0d-e45eced42fe6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.499 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] VM Resumed (Lifecycle Event)#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.529 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.534 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.543 187010 INFO nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Took 8.49 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.544 187010 DEBUG nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.558 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.600 187010 INFO nova.compute.manager [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Took 8.98 seconds to build instance.#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.619 187010 DEBUG oslo_concurrency.lockutils [None req-3108d186-29d9-4b2c-87d8-61d7d7c3cfa3 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:39 np0005555140 podman[217942]: 2025-12-11 09:59:39.674411953 +0000 UTC m=+0.057313713 container create 8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 04:59:39 np0005555140 systemd[1]: Started libpod-conmon-8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe.scope.
Dec 11 04:59:39 np0005555140 podman[217942]: 2025-12-11 09:59:39.641790188 +0000 UTC m=+0.024692008 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 04:59:39 np0005555140 systemd[1]: Started libcrun container.
Dec 11 04:59:39 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14b3946e98963c328e790fb25859d7e16cfe9e72292724ecbaa0488eea12b162/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 04:59:39 np0005555140 podman[217942]: 2025-12-11 09:59:39.755712762 +0000 UTC m=+0.138614552 container init 8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 04:59:39 np0005555140 podman[217942]: 2025-12-11 09:59:39.762140606 +0000 UTC m=+0.145042366 container start 8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 04:59:39 np0005555140 neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1[217957]: [NOTICE]   (217971) : New worker (217980) forked
Dec 11 04:59:39 np0005555140 neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1[217957]: [NOTICE]   (217971) : Loading success.
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.790 187010 DEBUG nova.network.neutron [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updated VIF entry in instance network info cache for port 2eb0e5af-4006-431a-890b-a8cc811acafe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.790 187010 DEBUG nova.network.neutron [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updating instance_info_cache with network_info: [{"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:59:39 np0005555140 nova_compute[187006]: 2025-12-11 09:59:39.808 187010 DEBUG oslo_concurrency.lockutils [req-41d5e90c-3a54-488b-aaab-eecbfce17510 req-612325b5-089b-4d7c-8e99-fdeca6bc1d89 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:59:39 np0005555140 podman[217959]: 2025-12-11 09:59:39.815483274 +0000 UTC m=+0.087342123 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Dec 11 04:59:39 np0005555140 podman[217990]: 2025-12-11 09:59:39.87468313 +0000 UTC m=+0.056129649 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 11 04:59:41 np0005555140 nova_compute[187006]: 2025-12-11 09:59:41.508 187010 DEBUG nova.compute.manager [req-43dd2b00-82bf-4592-9c7c-537ec632dc80 req-a4a708b0-dc22-4425-9dde-0ce3ff25fe1b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:59:41 np0005555140 nova_compute[187006]: 2025-12-11 09:59:41.508 187010 DEBUG oslo_concurrency.lockutils [req-43dd2b00-82bf-4592-9c7c-537ec632dc80 req-a4a708b0-dc22-4425-9dde-0ce3ff25fe1b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:41 np0005555140 nova_compute[187006]: 2025-12-11 09:59:41.509 187010 DEBUG oslo_concurrency.lockutils [req-43dd2b00-82bf-4592-9c7c-537ec632dc80 req-a4a708b0-dc22-4425-9dde-0ce3ff25fe1b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:41 np0005555140 nova_compute[187006]: 2025-12-11 09:59:41.509 187010 DEBUG oslo_concurrency.lockutils [req-43dd2b00-82bf-4592-9c7c-537ec632dc80 req-a4a708b0-dc22-4425-9dde-0ce3ff25fe1b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:41 np0005555140 nova_compute[187006]: 2025-12-11 09:59:41.510 187010 DEBUG nova.compute.manager [req-43dd2b00-82bf-4592-9c7c-537ec632dc80 req-a4a708b0-dc22-4425-9dde-0ce3ff25fe1b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] No waiting events found dispatching network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 04:59:41 np0005555140 nova_compute[187006]: 2025-12-11 09:59:41.510 187010 WARNING nova.compute.manager [req-43dd2b00-82bf-4592-9c7c-537ec632dc80 req-a4a708b0-dc22-4425-9dde-0ce3ff25fe1b b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received unexpected event network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe for instance with vm_state active and task_state None.#033[00m
Dec 11 04:59:41 np0005555140 nova_compute[187006]: 2025-12-11 09:59:41.839 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:42 np0005555140 nova_compute[187006]: 2025-12-11 09:59:42.888 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:44Z|00146|binding|INFO|Releasing lport d463ff8b-56a0-4fbf-bc0e-85a2feb7bc86 from this chassis (sb_readonly=0)
Dec 11 04:59:44 np0005555140 NetworkManager[55531]: <info>  [1765447184.1791] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Dec 11 04:59:44 np0005555140 NetworkManager[55531]: <info>  [1765447184.1801] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.182 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:44 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:44Z|00147|binding|INFO|Releasing lport d463ff8b-56a0-4fbf-bc0e-85a2feb7bc86 from this chassis (sb_readonly=0)
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.214 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.216 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.711 187010 DEBUG nova.compute.manager [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-changed-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.712 187010 DEBUG nova.compute.manager [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Refreshing instance network info cache due to event network-changed-2eb0e5af-4006-431a-890b-a8cc811acafe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.712 187010 DEBUG oslo_concurrency.lockutils [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.712 187010 DEBUG oslo_concurrency.lockutils [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:59:44 np0005555140 nova_compute[187006]: 2025-12-11 09:59:44.713 187010 DEBUG nova.network.neutron [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Refreshing network info cache for port 2eb0e5af-4006-431a-890b-a8cc811acafe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 04:59:45 np0005555140 podman[218012]: 2025-12-11 09:59:45.669619087 +0000 UTC m=+0.048167790 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 04:59:45 np0005555140 nova_compute[187006]: 2025-12-11 09:59:45.788 187010 DEBUG nova.network.neutron [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updated VIF entry in instance network info cache for port 2eb0e5af-4006-431a-890b-a8cc811acafe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 04:59:45 np0005555140 nova_compute[187006]: 2025-12-11 09:59:45.788 187010 DEBUG nova.network.neutron [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updating instance_info_cache with network_info: [{"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:59:46 np0005555140 nova_compute[187006]: 2025-12-11 09:59:46.636 187010 DEBUG oslo_concurrency.lockutils [req-18447468-2114-4b66-9037-b91ffe6f33c2 req-dc58a831-9de8-479d-9808-317381340b00 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 04:59:46 np0005555140 nova_compute[187006]: 2025-12-11 09:59:46.842 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:59:47 np0005555140 nova_compute[187006]: 2025-12-11 09:59:47.891 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 04:59:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:48.626 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 04:59:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:48.626 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 04:59:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:48.627 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.167 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'name': 'tempest-TestNetworkBasicOps-server-2095390827', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'user_id': '277eaa28c80b403abb371276e6721821', 'hostId': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.179 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.180 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6df67885-0e25-4474-803f-9a107108ec9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.168573', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21e8a1cc-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.793790859, 'message_signature': '446e7f5bf5605573833c5b407adeb58144f7f55d1c1548157b77e2aa900868da'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.168573', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21e8aece-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.793790859, 'message_signature': '642f564bab7ceebd41859687c015756aee1c5c1afb93d50ecb55e001b7082564'}]}, 'timestamp': '2025-12-11 09:59:50.180368', '_unique_id': 'f43e344d52f64ffa851df75a22543060'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.205 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.write.requests volume: 232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.206 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2aa536bc-740c-41b7-b737-c9d7035c9da8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 232, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.183039', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21eca312-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '115cc9e16ec49d44a3c8899ba1b9336c1b3f6e070ad3e6a8d62ddfb7c1c3a491'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.183039', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21ecae0c-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '0e89c1b79eaeea8e751444da00657e2085b7e8d6fd5f01d63b97624b3ff2fa45'}]}, 'timestamp': '2025-12-11 09:59:50.206528', '_unique_id': 'b3d9459016b1499e99b62dea5b19d08d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.211 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e861be47-625e-4830-ba0d-e45eced42fe6 / tap2eb0e5af-40 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.211 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e94fd77b-8849-4052-9dc4-6ef05c480ad2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.208106', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21ed864c-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '085072af467837a853c39c853c5214a9015bc0f87f4ac2c0f0f2f21d4de1b400'}]}, 'timestamp': '2025-12-11 09:59:50.212156', '_unique_id': 'f67ccaddd36f4f63a624bf9f0b1059fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.213 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.227 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/cpu volume: 9880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bff2f244-8a42-4490-b13f-0611971a1c74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9880000000, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'timestamp': '2025-12-11T09:59:50.214062', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '21efff9e-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.852905932, 'message_signature': '4bba0e8b279f47583adeb70448abb00be026bc197ffb03aee6b8c7fb2664d569'}]}, 'timestamp': '2025-12-11 09:59:50.228373', '_unique_id': 'a276525f604c4149a470d92e5f0731c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.230 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.230 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>]
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.230 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.read.bytes volume: 28404736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.230 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1a6727f-8388-4177-b91b-c3cee2938aa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28404736, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.230630', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21f064e8-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '8aaef0f0356c2f467878140d7b5203a24b830a52133b684f953df88d5b755752'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.230630', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21f06e16-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '749e46607446cc981e27842d820f4bb8fef7c58ffc0d0530c958181cfa3ddb1f'}]}, 'timestamp': '2025-12-11 09:59:50.231089', '_unique_id': 'f2ddbebce88e4aada569b2f632405ca0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.231 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.232 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.232 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1362223-c94a-49df-a551-4d3facd17fe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.232486', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21f0ae80-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.793790859, 'message_signature': 'd00e4f00f756fdfda19942caff034701822970fcfe22f8f1ea4eb339f71b549e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.232486', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21f0b9fc-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.793790859, 'message_signature': '31c16440bda010293e57ca0772db157be4d12904f48512291b904a3c55d7b175'}]}, 'timestamp': '2025-12-11 09:59:50.233064', '_unique_id': '084a933830624ef4b8514102cd44425b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.233 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.234 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e5fecb5-848b-46d1-ab6c-4a3e11d7d30b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.234512', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f0fdf4-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '3e059a34487c351c3f9481a6d79ec4c6fa2084fe3f67b1af73b495b70afc79a6'}]}, 'timestamp': '2025-12-11 09:59:50.234824', '_unique_id': '1edc02fecf4e426b80b9776d752338c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.235 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '188e2413-4ddf-49aa-bf86-2d67317dadab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'timestamp': '2025-12-11T09:59:50.236227', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '21f140d4-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.852905932, 'message_signature': '71efa5fbe4a70efcdea133716c579e21dde0caaa0e9b159e7813e4a546f61d7f'}]}, 'timestamp': '2025-12-11 09:59:50.236505', '_unique_id': '791a2e595b224bbebe81faf517c0b58e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.236 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a32fff16-2bd6-4917-bd2e-569a19fb3614', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.238177', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f18cce-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '44317bfb3aee5999ec1707c43f377dd8aa5b17b661c9819c127a065dcfdc4857'}]}, 'timestamp': '2025-12-11 09:59:50.238443', '_unique_id': 'cc38cfde0dec492c9c4d05f04b51b3e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.238 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.239 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e40ee1c2-fd6b-4d53-b1c9-6ef4ceefe79f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.239652', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f1c6b2-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '755922c3a0ce6f9c330e933e1cbff85391ed4ab75cc2e512298e04c8d6812b43'}]}, 'timestamp': '2025-12-11 09:59:50.239985', '_unique_id': 'd1915dbcc38e4b16815fea3e3e569606'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.240 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.241 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.241 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.241 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>]
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.241 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.241 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.read.requests volume: 982 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.241 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '911ab03a-9684-4b6a-a239-9049344ae9f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 982, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.241539', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21f20fdc-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '3ebf6161f707bb6865da4ac327c5b3f75fa4da1f0f2623c38d739d72952bb440'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.241539', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21f2186a-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '421c5b6339a1e953a2dfc67d19c1ae7f7d8a918301bc377ee0999d7b552dbd3a'}]}, 'timestamp': '2025-12-11 09:59:50.241999', '_unique_id': 'bc1e6defc9ee47609e3b10767f79998f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.242 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.outgoing.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d7b3a49-e4ab-4f51-93dd-a03dd0c07b79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.243139', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f24e7a-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '35bf7b20db30629a306d1c9d721a51bfa3fb9ea32ca0048aab29bee12da40a52'}]}, 'timestamp': '2025-12-11 09:59:50.243444', '_unique_id': '7f390801d2df4e0b8863c300b03029df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.243 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.244 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.244 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c5b8a0b-11a7-470c-b2d4-56da73e7f49d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.244626', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21f288c2-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.793790859, 'message_signature': 'e69f66601f6735448a7cead394279c41d4b710b0142ae6509e24842901f695c5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 
'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.244626', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21f291fa-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.793790859, 'message_signature': '809f35a745b9c9e4d2e4eab51224f42b60ef646ca8215bad99df098a969da232'}]}, 'timestamp': '2025-12-11 09:59:50.245112', '_unique_id': '20e1112539764ee7860e8bfbb44ee0f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.245 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.246 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.write.latency volume: 3220918027 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.246 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e34e4ad-29f9-4078-b0d4-9642c51bf070', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3220918027, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.246255', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21f2c6d4-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '6110d80eac0b35d70b430b974122c7173a40dd404fad91538023b154b1a7536e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 
'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.246255', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21f2cec2-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': 'd1675b5e24d02393b9bbe7153b75acdd18085fae9da78db7092226ed7e5d8aad'}]}, 'timestamp': '2025-12-11 09:59:50.246665', '_unique_id': '7f6a240dc66d44c09bae88b36d81157f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.247 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.248 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>]
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.248 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '682eb59c-1c19-4f08-8cd8-3fecb9a7c183', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.248245', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f31634-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '8d42953ac8f442c957ff513439248e551798184091e61ae1df3d7af45550d9e1'}]}, 'timestamp': '2025-12-11 09:59:50.248548', '_unique_id': '2b39d97c84b943229ab40d3c0702e336'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.249 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09cfe85e-4b71-4f9c-a116-4d357f225bec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.249696', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f34dca-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '0f919701215a2e33fab857bb011941cf8f9bfa8754c1d6a566957ee7d83a7a32'}]}, 'timestamp': '2025-12-11 09:59:50.249953', '_unique_id': 'ad0641781e21417eb820d2d52c484ec6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.250 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.outgoing.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '681cde8d-5d4e-466e-ac5f-0eed96f6b4e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.251096', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f38434-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '556a62c3cd9ef481c3f939f1e052c4b7e8952ba6026ec5b4deac47f0311130fa'}]}, 'timestamp': '2025-12-11 09:59:50.251328', '_unique_id': '260c86a67faf4ea6affc869b718e9794'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.251 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.252 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.252 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2095390827>]
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.252 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee81cb00-4641-4a75-9bb9-e0b5146739a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.252832', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f3ca02-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '089efde23d068ab937e6e9e3895ab403a0a849e3263e5aa80e2d3d4768afd975'}]}, 'timestamp': '2025-12-11 09:59:50.253148', '_unique_id': '15936573975d43ab9494e2d786b2b622'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.253 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.254 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.254 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '576a0914-bff8-41ca-b5ea-39eb48432a11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.254513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21f40a30-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '51fb4f8f32cf3176e80fac65030d59a4a16ff01e700ba9c0e28029d835c7911a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.254513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21f4155c-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '57e75b37ec52cc6cdebd21c64692644a22533416bda2d4a72edcd7cd0dc81401'}]}, 'timestamp': '2025-12-11 09:59:50.255059', '_unique_id': 'fafe709f22f34b388a2babba56738943'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.255 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.256 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/network.incoming.bytes volume: 916 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f96e4ebb-b19a-4a23-b9ec-afb79f509123', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 916, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'instance-0000000a-e861be47-625e-4830-ba0d-e45eced42fe6-tap2eb0e5af-40', 'timestamp': '2025-12-11T09:59:50.256539', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'tap2eb0e5af-40', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:31:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2eb0e5af-40'}, 'message_id': '21f459ea-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.833313271, 'message_signature': '5f5758d44f6810d0ec31e4a7517b9b38dbc4abc969e4cc5fe33ebb20f97ad806'}]}, 'timestamp': '2025-12-11 09:59:50.256843', '_unique_id': '2cbecb3eca214bab8f36d504bad972e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.257 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.258 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.read.latency volume: 170462212 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.258 12 DEBUG ceilometer.compute.pollsters [-] e861be47-625e-4830-ba0d-e45eced42fe6/disk.device.read.latency volume: 31833882 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8c64123-aca3-4638-98dc-499f1fa3fa60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170462212, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-vda', 'timestamp': '2025-12-11T09:59:50.258396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '21f4a2b0-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': '5819e382bd4c3e443ff12a3ace93ce1154de1e28c80370cd1e199b7b6e453f88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31833882, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_name': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_name': None, 'resource_id': 'e861be47-625e-4830-ba0d-e45eced42fe6-sda', 'timestamp': '2025-12-11T09:59:50.258396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2095390827', 'name': 'instance-0000000a', 'instance_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'instance_type': 'm1.nano', 'host': '31956de6f5bbc4087f79b156eb33d14a8c317b90dc096ce934f10f79', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '8ceb5bb7-cd53-4ae6-a352-a5023850ca5b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9e66a2ab-a034-4869-91a9-a90f37915272'}, 'image_ref': '9e66a2ab-a034-4869-91a9-a90f37915272', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '21f4ad82-d678-11f0-9d2b-fa163e7e2991', 'monotonic_time': 3548.808253733, 'message_signature': 'f06ab3a877c96a99caf36a54abf0ead341dbcf81dee2673017dd84e876dabd17'}]}, 'timestamp': '2025-12-11 09:59:50.258990', '_unique_id': '4991607face9483aa4b76291552a4dc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 04:59:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 09:59:50.259 12 ERROR oslo_messaging.notify.messaging 
Dec 11 04:59:50 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:50Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:31:62 10.100.0.14
Dec 11 04:59:50 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:50Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:31:62 10.100.0.14
Dec 11 04:59:50 np0005555140 podman[218050]: 2025-12-11 09:59:50.673163992 +0000 UTC m=+0.050884928 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 04:59:51 np0005555140 nova_compute[187006]: 2025-12-11 09:59:51.844 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:52 np0005555140 podman[218075]: 2025-12-11 09:59:52.69177615 +0000 UTC m=+0.064481628 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Dec 11 04:59:52 np0005555140 podman[218074]: 2025-12-11 09:59:52.70120311 +0000 UTC m=+0.079446727 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 04:59:52 np0005555140 nova_compute[187006]: 2025-12-11 09:59:52.996 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:53 np0005555140 nova_compute[187006]: 2025-12-11 09:59:53.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:55 np0005555140 nova_compute[187006]: 2025-12-11 09:59:55.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:55 np0005555140 nova_compute[187006]: 2025-12-11 09:59:55.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 04:59:55 np0005555140 nova_compute[187006]: 2025-12-11 09:59:55.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 04:59:56 np0005555140 nova_compute[187006]: 2025-12-11 09:59:56.640 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 04:59:56 np0005555140 nova_compute[187006]: 2025-12-11 09:59:56.641 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquired lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 04:59:56 np0005555140 nova_compute[187006]: 2025-12-11 09:59:56.641 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 04:59:56 np0005555140 nova_compute[187006]: 2025-12-11 09:59:56.641 187010 DEBUG nova.objects.instance [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e861be47-625e-4830-ba0d-e45eced42fe6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 04:59:56 np0005555140 nova_compute[187006]: 2025-12-11 09:59:56.848 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:57 np0005555140 nova_compute[187006]: 2025-12-11 09:59:57.030 187010 INFO nova.compute.manager [None req-7a8464b4-baca-4b3f-bff8-046b24e3e513 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Get console output#033[00m
Dec 11 04:59:57 np0005555140 nova_compute[187006]: 2025-12-11 09:59:57.036 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 04:59:57 np0005555140 nova_compute[187006]: 2025-12-11 09:59:57.999 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:58 np0005555140 nova_compute[187006]: 2025-12-11 09:59:58.320 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 04:59:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:58.321 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 04:59:58 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 09:59:58.323 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 04:59:59 np0005555140 ovn_controller[95438]: 2025-12-11T09:59:59Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:31:62 10.100.0.14
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.790 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updating instance_info_cache with network_info: [{"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.825 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Releasing lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.826 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.826 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.857 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.858 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.858 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.859 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 04:59:59 np0005555140 nova_compute[187006]: 2025-12-11 09:59:59.945 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:00 np0005555140 nova_compute[187006]: 2025-12-11 10:00:00.001 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:00 np0005555140 nova_compute[187006]: 2025-12-11 10:00:00.002 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:00 np0005555140 nova_compute[187006]: 2025-12-11 10:00:00.078 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:00 np0005555140 nova_compute[187006]: 2025-12-11 10:00:00.231 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:00:00 np0005555140 nova_compute[187006]: 2025-12-11 10:00:00.232 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5550MB free_disk=73.29970169067383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:00:00 np0005555140 nova_compute[187006]: 2025-12-11 10:00:00.233 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:00 np0005555140 nova_compute[187006]: 2025-12-11 10:00:00.233 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.071 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance e861be47-625e-4830-ba0d-e45eced42fe6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.072 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.072 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.121 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.618 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.655 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.656 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:01 np0005555140 nova_compute[187006]: 2025-12-11 10:00:01.851 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:02.324 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.656 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:00:02 np0005555140 podman[218130]: 2025-12-11 10:00:02.675556679 +0000 UTC m=+0.051202658 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.813 187010 DEBUG nova.compute.manager [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-changed-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.813 187010 DEBUG nova.compute.manager [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Refreshing instance network info cache due to event network-changed-2eb0e5af-4006-431a-890b-a8cc811acafe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.814 187010 DEBUG oslo_concurrency.lockutils [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.814 187010 DEBUG oslo_concurrency.lockutils [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.814 187010 DEBUG nova.network.neutron [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Refreshing network info cache for port 2eb0e5af-4006-431a-890b-a8cc811acafe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.880 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.881 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.881 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.881 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.881 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.882 187010 INFO nova.compute.manager [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Terminating instance#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.883 187010 DEBUG nova.compute.manager [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 05:00:02 np0005555140 kernel: tap2eb0e5af-40 (unregistering): left promiscuous mode
Dec 11 05:00:02 np0005555140 NetworkManager[55531]: <info>  [1765447202.9100] device (tap2eb0e5af-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 05:00:02 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:02Z|00148|binding|INFO|Releasing lport 2eb0e5af-4006-431a-890b-a8cc811acafe from this chassis (sb_readonly=0)
Dec 11 05:00:02 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:02Z|00149|binding|INFO|Setting lport 2eb0e5af-4006-431a-890b-a8cc811acafe down in Southbound
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.918 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:02 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:02Z|00150|binding|INFO|Removing iface tap2eb0e5af-40 ovn-installed in OVS
Dec 11 05:00:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:02.929 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:31:62 10.100.0.14'], port_security=['fa:16:3e:5b:31:62 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e861be47-625e-4830-ba0d-e45eced42fe6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a24d1a8b-da5b-4cf2-9cb3-2ca5e82221bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=720c4386-3ed1-4071-970a-4630c6cc256f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=2eb0e5af-4006-431a-890b-a8cc811acafe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:00:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:02.930 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 2eb0e5af-4006-431a-890b-a8cc811acafe in datapath 56f13783-e1c0-4b79-861d-1f0a15faf4d1 unbound from our chassis#033[00m
Dec 11 05:00:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:02.931 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 56f13783-e1c0-4b79-861d-1f0a15faf4d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 05:00:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:02.932 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[1090f5de-f867-4f4f-bb9c-3928a2290300]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:02 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:02.932 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1 namespace which is not needed anymore#033[00m
Dec 11 05:00:02 np0005555140 nova_compute[187006]: 2025-12-11 10:00:02.936 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:02 np0005555140 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 11 05:00:02 np0005555140 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 12.374s CPU time.
Dec 11 05:00:02 np0005555140 systemd-machined[153398]: Machine qemu-10-instance-0000000a terminated.
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.001 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:03 np0005555140 neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1[217957]: [NOTICE]   (217971) : haproxy version is 2.8.14-c23fe91
Dec 11 05:00:03 np0005555140 neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1[217957]: [NOTICE]   (217971) : path to executable is /usr/sbin/haproxy
Dec 11 05:00:03 np0005555140 neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1[217957]: [WARNING]  (217971) : Exiting Master process...
Dec 11 05:00:03 np0005555140 neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1[217957]: [ALERT]    (217971) : Current worker (217980) exited with code 143 (Terminated)
Dec 11 05:00:03 np0005555140 neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1[217957]: [WARNING]  (217971) : All workers exited. Exiting... (0)
Dec 11 05:00:03 np0005555140 systemd[1]: libpod-8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe.scope: Deactivated successfully.
Dec 11 05:00:03 np0005555140 podman[218179]: 2025-12-11 10:00:03.065636173 +0000 UTC m=+0.048838940 container died 8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 05:00:03 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe-userdata-shm.mount: Deactivated successfully.
Dec 11 05:00:03 np0005555140 systemd[1]: var-lib-containers-storage-overlay-14b3946e98963c328e790fb25859d7e16cfe9e72292724ecbaa0488eea12b162-merged.mount: Deactivated successfully.
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.101 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.106 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:03 np0005555140 podman[218179]: 2025-12-11 10:00:03.117934002 +0000 UTC m=+0.101136759 container cleanup 8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 11 05:00:03 np0005555140 systemd[1]: libpod-conmon-8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe.scope: Deactivated successfully.
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.147 187010 INFO nova.virt.libvirt.driver [-] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Instance destroyed successfully.#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.148 187010 DEBUG nova.objects.instance [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid e861be47-625e-4830-ba0d-e45eced42fe6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.162 187010 DEBUG nova.virt.libvirt.vif [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T09:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2095390827',display_name='tempest-TestNetworkBasicOps-server-2095390827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2095390827',id=10,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBJTRNPYFwJw3SQZr4ROevEM5duu4m1YhuxEhBAqhG2Y11PAdG6TG7FM7xOCWCJLl0z/aX/t188s0dkX9sSWjb0QAj8w0vF2wJi3qc/ps4dx4WKZO0Q4XWGyhVjy86v35g==',key_name='tempest-TestNetworkBasicOps-294158070',keypairs=<?>,launch_index=0,launched_at=2025-12-11T09:59:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-elyzwnhg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T09:59:39Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=e861be47-625e-4830-ba0d-e45eced42fe6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.162 187010 DEBUG nova.network.os_vif_util [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.163 187010 DEBUG nova.network.os_vif_util [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:31:62,bridge_name='br-int',has_traffic_filtering=True,id=2eb0e5af-4006-431a-890b-a8cc811acafe,network=Network(56f13783-e1c0-4b79-861d-1f0a15faf4d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eb0e5af-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.163 187010 DEBUG os_vif [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:31:62,bridge_name='br-int',has_traffic_filtering=True,id=2eb0e5af-4006-431a-890b-a8cc811acafe,network=Network(56f13783-e1c0-4b79-861d-1f0a15faf4d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eb0e5af-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.166 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.166 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eb0e5af-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.168 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.170 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.173 187010 INFO os_vif [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:31:62,bridge_name='br-int',has_traffic_filtering=True,id=2eb0e5af-4006-431a-890b-a8cc811acafe,network=Network(56f13783-e1c0-4b79-861d-1f0a15faf4d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2eb0e5af-40')#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.174 187010 INFO nova.virt.libvirt.driver [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Deleting instance files /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6_del#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.175 187010 INFO nova.virt.libvirt.driver [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Deletion of /var/lib/nova/instances/e861be47-625e-4830-ba0d-e45eced42fe6_del complete#033[00m
Dec 11 05:00:03 np0005555140 podman[218219]: 2025-12-11 10:00:03.198961793 +0000 UTC m=+0.054616736 container remove 8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.205 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e262bf3c-e5d0-42c2-8346-b2def54d33f0]: (4, ('Thu Dec 11 10:00:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1 (8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe)\n8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe\nThu Dec 11 10:00:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1 (8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe)\n8ce6620f91732cf4934430daa39240e71751cfef75a161ec2ea29f3e8d98eefe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.208 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[62739fa2-8dbd-4639-90e0-08787be4c8a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.209 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56f13783-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.210 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:03 np0005555140 kernel: tap56f13783-e0: left promiscuous mode
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.223 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.228 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b20b65fa-803f-4e60-8c5e-136a0d6f0b15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.248 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a56cd686-263b-4eb4-a2a7-bb72a0a69856]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.250 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[090029d0-6c92-49c4-bc67-32d8483f62f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.261 187010 INFO nova.compute.manager [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.262 187010 DEBUG oslo.service.loopingcall [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.262 187010 DEBUG nova.compute.manager [-] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 05:00:03 np0005555140 nova_compute[187006]: 2025-12-11 10:00:03.263 187010 DEBUG nova.network.neutron [-] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.266 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[ac7273d9-9163-40e2-8605-4ba6de8f1149]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353765, 'reachable_time': 44240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218237, 'error': None, 'target': 'ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:03 np0005555140 systemd[1]: run-netns-ovnmeta\x2d56f13783\x2de1c0\x2d4b79\x2d861d\x2d1f0a15faf4d1.mount: Deactivated successfully.
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.270 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-56f13783-e1c0-4b79-861d-1f0a15faf4d1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 05:00:03 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:03.270 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[71af5886-6a42-4095-9ae6-21dd338cd57f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.346 187010 DEBUG nova.network.neutron [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updated VIF entry in instance network info cache for port 2eb0e5af-4006-431a-890b-a8cc811acafe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.347 187010 DEBUG nova.network.neutron [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updating instance_info_cache with network_info: [{"id": "2eb0e5af-4006-431a-890b-a8cc811acafe", "address": "fa:16:3e:5b:31:62", "network": {"id": "56f13783-e1c0-4b79-861d-1f0a15faf4d1", "bridge": "br-int", "label": "tempest-network-smoke--1668513687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2eb0e5af-40", "ovs_interfaceid": "2eb0e5af-4006-431a-890b-a8cc811acafe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.390 187010 DEBUG oslo_concurrency.lockutils [req-6be3ed88-7124-4182-815f-ac39f55a6a5e req-1b4b0ed3-d2df-49e0-a7af-b5f51c665cc7 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e861be47-625e-4830-ba0d-e45eced42fe6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.409 187010 DEBUG nova.network.neutron [-] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.426 187010 INFO nova.compute.manager [-] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Took 1.16 seconds to deallocate network for instance.#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.731 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.732 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.798 187010 DEBUG nova.compute.provider_tree [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.826 187010 DEBUG nova.scheduler.client.report [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.874 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.956 187010 INFO nova.scheduler.client.report [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance e861be47-625e-4830-ba0d-e45eced42fe6#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.983 187010 DEBUG nova.compute.manager [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-vif-unplugged-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.984 187010 DEBUG oslo_concurrency.lockutils [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.985 187010 DEBUG oslo_concurrency.lockutils [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.985 187010 DEBUG oslo_concurrency.lockutils [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.986 187010 DEBUG nova.compute.manager [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] No waiting events found dispatching network-vif-unplugged-2eb0e5af-4006-431a-890b-a8cc811acafe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.986 187010 WARNING nova.compute.manager [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received unexpected event network-vif-unplugged-2eb0e5af-4006-431a-890b-a8cc811acafe for instance with vm_state deleted and task_state None.#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.987 187010 DEBUG nova.compute.manager [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.987 187010 DEBUG oslo_concurrency.lockutils [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.987 187010 DEBUG oslo_concurrency.lockutils [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.988 187010 DEBUG oslo_concurrency.lockutils [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.988 187010 DEBUG nova.compute.manager [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] No waiting events found dispatching network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.989 187010 WARNING nova.compute.manager [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received unexpected event network-vif-plugged-2eb0e5af-4006-431a-890b-a8cc811acafe for instance with vm_state deleted and task_state None.#033[00m
Dec 11 05:00:04 np0005555140 nova_compute[187006]: 2025-12-11 10:00:04.989 187010 DEBUG nova.compute.manager [req-3dbcf996-da93-4490-8927-e245d4145ce3 req-60689703-ecef-4c30-979c-9fd0b4edb11a b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Received event network-vif-deleted-2eb0e5af-4006-431a-890b-a8cc811acafe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:05 np0005555140 nova_compute[187006]: 2025-12-11 10:00:05.151 187010 DEBUG oslo_concurrency.lockutils [None req-ccb2acf3-d5cd-4c14-8200-5fff6861c143 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e861be47-625e-4830-ba0d-e45eced42fe6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:05 np0005555140 nova_compute[187006]: 2025-12-11 10:00:05.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:00:06 np0005555140 nova_compute[187006]: 2025-12-11 10:00:06.853 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:08 np0005555140 nova_compute[187006]: 2025-12-11 10:00:08.170 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:10 np0005555140 podman[218238]: 2025-12-11 10:00:10.695682791 +0000 UTC m=+0.059943499 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 05:00:10 np0005555140 podman[218239]: 2025-12-11 10:00:10.7099731 +0000 UTC m=+0.066541827 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 11 05:00:11 np0005555140 nova_compute[187006]: 2025-12-11 10:00:11.854 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:13 np0005555140 nova_compute[187006]: 2025-12-11 10:00:13.173 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:14 np0005555140 nova_compute[187006]: 2025-12-11 10:00:14.014 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:14 np0005555140 nova_compute[187006]: 2025-12-11 10:00:14.085 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:16 np0005555140 podman[218282]: 2025-12-11 10:00:16.685289103 +0000 UTC m=+0.053610907 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Dec 11 05:00:16 np0005555140 nova_compute[187006]: 2025-12-11 10:00:16.858 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:18 np0005555140 nova_compute[187006]: 2025-12-11 10:00:18.145 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447203.143396, e861be47-625e-4830-ba0d-e45eced42fe6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:00:18 np0005555140 nova_compute[187006]: 2025-12-11 10:00:18.146 187010 INFO nova.compute.manager [-] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] VM Stopped (Lifecycle Event)#033[00m
Dec 11 05:00:18 np0005555140 nova_compute[187006]: 2025-12-11 10:00:18.169 187010 DEBUG nova.compute.manager [None req-96f4da92-2aa2-4115-8456-759dd5b5d365 - - - - - -] [instance: e861be47-625e-4830-ba0d-e45eced42fe6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:18 np0005555140 nova_compute[187006]: 2025-12-11 10:00:18.176 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:21 np0005555140 podman[218301]: 2025-12-11 10:00:21.672617944 +0000 UTC m=+0.052182476 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 05:00:21 np0005555140 nova_compute[187006]: 2025-12-11 10:00:21.858 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:23 np0005555140 nova_compute[187006]: 2025-12-11 10:00:23.178 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:23 np0005555140 podman[218326]: 2025-12-11 10:00:23.698705695 +0000 UTC m=+0.061523173 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, release=1755695350, managed_by=edpm_ansible)
Dec 11 05:00:23 np0005555140 podman[218325]: 2025-12-11 10:00:23.726372458 +0000 UTC m=+0.099839181 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 05:00:26 np0005555140 nova_compute[187006]: 2025-12-11 10:00:26.860 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:28 np0005555140 nova_compute[187006]: 2025-12-11 10:00:28.181 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.215 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.215 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.245 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.330 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.331 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.339 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.340 187010 INFO nova.compute.claims [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.439 187010 DEBUG nova.compute.provider_tree [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.455 187010 DEBUG nova.scheduler.client.report [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.473 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.474 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.515 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.515 187010 DEBUG nova.network.neutron [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.534 187010 INFO nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.555 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.701 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.702 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.703 187010 INFO nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Creating image(s)
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.703 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.704 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.704 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.720 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.778 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.779 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.780 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.791 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.865 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.881 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.881 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.899 187010 DEBUG nova.policy [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.919 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.920 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.920 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.975 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.976 187010 DEBUG nova.virt.disk.api [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 05:00:31 np0005555140 nova_compute[187006]: 2025-12-11 10:00:31.976 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.031 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.033 187010 DEBUG nova.virt.disk.api [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.034 187010 DEBUG nova.objects.instance [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid e7fc787e-783c-48a9-947b-cb7d8a412e60 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.059 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.060 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Ensure instance console log exists: /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.061 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.061 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.062 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:00:32 np0005555140 nova_compute[187006]: 2025-12-11 10:00:32.563 187010 DEBUG nova.network.neutron [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Successfully created port: af78ff97-5ade-4b55-ab3d-05399536509f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.185 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.602 187010 DEBUG nova.network.neutron [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Successfully updated port: af78ff97-5ade-4b55-ab3d-05399536509f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.619 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.620 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.620 187010 DEBUG nova.network.neutron [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 11 05:00:33 np0005555140 podman[218385]: 2025-12-11 10:00:33.680965835 +0000 UTC m=+0.056785877 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.682 187010 DEBUG nova.compute.manager [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.682 187010 DEBUG nova.compute.manager [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing instance network info cache due to event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.683 187010 DEBUG oslo_concurrency.lockutils [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 05:00:33 np0005555140 nova_compute[187006]: 2025-12-11 10:00:33.866 187010 DEBUG nova.network.neutron [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.488 187010 DEBUG nova.network.neutron [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.517 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.517 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Instance network_info: |[{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.518 187010 DEBUG oslo_concurrency.lockutils [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.518 187010 DEBUG nova.network.neutron [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.521 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Start _get_guest_xml network_info=[{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.526 187010 WARNING nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.531 187010 DEBUG nova.virt.libvirt.host [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.532 187010 DEBUG nova.virt.libvirt.host [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.541 187010 DEBUG nova.virt.libvirt.host [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.542 187010 DEBUG nova.virt.libvirt.host [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.542 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.543 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.543 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.544 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.544 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.544 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.544 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.545 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.545 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.545 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.546 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.546 187010 DEBUG nova.virt.hardware [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.551 187010 DEBUG nova.virt.libvirt.vif [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T10:00:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1717863907',display_name='tempest-TestNetworkBasicOps-server-1717863907',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1717863907',id=11,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJMBSIhZ4EVil7AXrHPe22R5NehWPSWuc1SalCvSn9cIqHbM1CDGz5cixIH9uspQSK0YbE0eVVIj4M3PRbSdv5mbOULzw4alwWqm1ii8fiAxnjVOpwfUfrb2MBIDBIX+nw==',key_name='tempest-TestNetworkBasicOps-1529837035',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-6j1yq68b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T10:00:31Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=e7fc787e-783c-48a9-947b-cb7d8a412e60,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.552 187010 DEBUG nova.network.os_vif_util [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.552 187010 DEBUG nova.network.os_vif_util [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:91:c4,bridge_name='br-int',has_traffic_filtering=True,id=af78ff97-5ade-4b55-ab3d-05399536509f,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf78ff97-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.553 187010 DEBUG nova.objects.instance [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid e7fc787e-783c-48a9-947b-cb7d8a412e60 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.568 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] End _get_guest_xml xml=<domain type="kvm">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <uuid>e7fc787e-783c-48a9-947b-cb7d8a412e60</uuid>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <name>instance-0000000b</name>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1717863907</nova:name>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 10:00:34</nova:creationTime>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        <nova:port uuid="af78ff97-5ade-4b55-ab3d-05399536509f">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <system>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <entry name="serial">e7fc787e-783c-48a9-947b-cb7d8a412e60</entry>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <entry name="uuid">e7fc787e-783c-48a9-947b-cb7d8a412e60</entry>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </system>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <os>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  </os>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <features>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  </features>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  </clock>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  <devices>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </disk>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk.config"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </disk>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:23:91:c4"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <target dev="tapaf78ff97-5a"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </interface>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/console.log" append="off"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </serial>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <video>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </video>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </rng>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 05:00:34 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 05:00:34 np0005555140 nova_compute[187006]:  </devices>
Dec 11 05:00:34 np0005555140 nova_compute[187006]: </domain>
Dec 11 05:00:34 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.570 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Preparing to wait for external event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.571 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.571 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.571 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.572 187010 DEBUG nova.virt.libvirt.vif [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T10:00:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1717863907',display_name='tempest-TestNetworkBasicOps-server-1717863907',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1717863907',id=11,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJMBSIhZ4EVil7AXrHPe22R5NehWPSWuc1SalCvSn9cIqHbM1CDGz5cixIH9uspQSK0YbE0eVVIj4M3PRbSdv5mbOULzw4alwWqm1ii8fiAxnjVOpwfUfrb2MBIDBIX+nw==',key_name='tempest-TestNetworkBasicOps-1529837035',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-6j1yq68b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T10:00:31Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=e7fc787e-783c-48a9-947b-cb7d8a412e60,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.572 187010 DEBUG nova.network.os_vif_util [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.573 187010 DEBUG nova.network.os_vif_util [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:91:c4,bridge_name='br-int',has_traffic_filtering=True,id=af78ff97-5ade-4b55-ab3d-05399536509f,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf78ff97-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.574 187010 DEBUG os_vif [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:91:c4,bridge_name='br-int',has_traffic_filtering=True,id=af78ff97-5ade-4b55-ab3d-05399536509f,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf78ff97-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.574 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.575 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.575 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.579 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.579 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf78ff97-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.580 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf78ff97-5a, col_values=(('external_ids', {'iface-id': 'af78ff97-5ade-4b55-ab3d-05399536509f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:91:c4', 'vm-uuid': 'e7fc787e-783c-48a9-947b-cb7d8a412e60'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.582 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:34 np0005555140 NetworkManager[55531]: <info>  [1765447234.5839] manager: (tapaf78ff97-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.584 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.588 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.589 187010 INFO os_vif [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:91:c4,bridge_name='br-int',has_traffic_filtering=True,id=af78ff97-5ade-4b55-ab3d-05399536509f,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf78ff97-5a')#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.756 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.756 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.757 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:23:91:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 05:00:34 np0005555140 nova_compute[187006]: 2025-12-11 10:00:34.757 187010 INFO nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Using config drive#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.042 187010 INFO nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Creating config drive at /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk.config#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.047 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy6q8p1pl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.177 187010 DEBUG oslo_concurrency.processutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy6q8p1pl" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:35 np0005555140 kernel: tapaf78ff97-5a: entered promiscuous mode
Dec 11 05:00:35 np0005555140 NetworkManager[55531]: <info>  [1765447235.2483] manager: (tapaf78ff97-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Dec 11 05:00:35 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:35Z|00151|binding|INFO|Claiming lport af78ff97-5ade-4b55-ab3d-05399536509f for this chassis.
Dec 11 05:00:35 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:35Z|00152|binding|INFO|af78ff97-5ade-4b55-ab3d-05399536509f: Claiming fa:16:3e:23:91:c4 10.100.0.5
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.250 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.253 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.263 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:91:c4 10.100.0.5'], port_security=['fa:16:3e:23:91:c4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e7fc787e-783c-48a9-947b-cb7d8a412e60', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8d836b6d-b2d8-4d99-ba32-610089cbc32b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c13e7c47-09f9-4a17-81fa-f5bc1627478a, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=af78ff97-5ade-4b55-ab3d-05399536509f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.264 104288 INFO neutron.agent.ovn.metadata.agent [-] Port af78ff97-5ade-4b55-ab3d-05399536509f in datapath 5a47ac59-4586-4282-be2b-545a8e2d8aa8 bound to our chassis#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.265 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5a47ac59-4586-4282-be2b-545a8e2d8aa8#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.279 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b0202bca-247f-4665-9b4e-825ab2e83d17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.281 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5a47ac59-41 in ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.283 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5a47ac59-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.283 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[811e60b6-9768-4131-82ae-d312075d5816]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.285 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[11339e62-5c35-45c7-aaa4-a1bda4db0fff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 systemd-udevd[218429]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.298 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[b0be5392-d715-4f8d-8578-7e33aa1a41dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 systemd-machined[153398]: New machine qemu-11-instance-0000000b.
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.306 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 NetworkManager[55531]: <info>  [1765447235.3090] device (tapaf78ff97-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 05:00:35 np0005555140 NetworkManager[55531]: <info>  [1765447235.3097] device (tapaf78ff97-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 05:00:35 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:35Z|00153|binding|INFO|Setting lport af78ff97-5ade-4b55-ab3d-05399536509f ovn-installed in OVS
Dec 11 05:00:35 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:35Z|00154|binding|INFO|Setting lport af78ff97-5ade-4b55-ab3d-05399536509f up in Southbound
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.311 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.317 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4efc5eac-cf58-423e-a531-8efd05ce85d4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.343 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[eba6d155-0171-4c3e-afed-589d8450a7cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 systemd-udevd[218434]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 05:00:35 np0005555140 NetworkManager[55531]: <info>  [1765447235.3513] manager: (tap5a47ac59-40): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.350 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[305c07ce-0149-474f-acdf-c129a8fbbfcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.383 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[53c02255-e4e7-4579-9ca4-54dd8e370eb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.386 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[6d61a218-404f-48c1-8c56-c5960af4c562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 NetworkManager[55531]: <info>  [1765447235.4116] device (tap5a47ac59-40): carrier: link connected
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.417 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[6cecd591-3f94-4507-b8c3-12c675c7ad7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.435 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8838b5-3ac8-4dd5-b02c-85c1e279f0a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5a47ac59-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:3d:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359398, 'reachable_time': 29685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218462, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.449 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[483f36ce-fdd6-4f3c-81cf-559777c7ffca]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:3d0d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359398, 'tstamp': 359398}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218465, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.463 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[3b75b4a1-6e2c-4ffc-a8c8-a3383fa91c78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5a47ac59-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:3d:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359398, 'reachable_time': 29685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218470, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.505 187010 DEBUG nova.compute.manager [req-d8d62bb5-5123-43af-bcc6-39c19a48111b req-7769cabe-23e5-43a9-92f6-b64d9e411126 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.506 187010 DEBUG oslo_concurrency.lockutils [req-d8d62bb5-5123-43af-bcc6-39c19a48111b req-7769cabe-23e5-43a9-92f6-b64d9e411126 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.506 187010 DEBUG oslo_concurrency.lockutils [req-d8d62bb5-5123-43af-bcc6-39c19a48111b req-7769cabe-23e5-43a9-92f6-b64d9e411126 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.506 187010 DEBUG oslo_concurrency.lockutils [req-d8d62bb5-5123-43af-bcc6-39c19a48111b req-7769cabe-23e5-43a9-92f6-b64d9e411126 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.507 187010 DEBUG nova.compute.manager [req-d8d62bb5-5123-43af-bcc6-39c19a48111b req-7769cabe-23e5-43a9-92f6-b64d9e411126 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Processing event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.508 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[42ad8f52-8847-4b61-acd4-b174b4617435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.534 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447235.5336423, e7fc787e-783c-48a9-947b-cb7d8a412e60 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.535 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] VM Started (Lifecycle Event)#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.538 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.542 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.547 187010 INFO nova.virt.libvirt.driver [-] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Instance spawned successfully.#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.547 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.566 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.573 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.576 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.577 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.577 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.577 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.578 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.578 187010 DEBUG nova.virt.libvirt.driver [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.588 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[dcccad25-1e44-4e02-bc8b-af2bb4d5a506]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.590 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5a47ac59-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.590 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.590 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5a47ac59-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.592 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 kernel: tap5a47ac59-40: entered promiscuous mode
Dec 11 05:00:35 np0005555140 NetworkManager[55531]: <info>  [1765447235.5937] manager: (tap5a47ac59-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.595 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.595 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5a47ac59-40, col_values=(('external_ids', {'iface-id': 'a4f6738c-9a28-4a6d-8841-3c9e211c20b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.596 187010 DEBUG nova.network.neutron [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updated VIF entry in instance network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:00:35 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:35Z|00155|binding|INFO|Releasing lport a4f6738c-9a28-4a6d-8841-3c9e211c20b8 from this chassis (sb_readonly=0)
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.597 187010 DEBUG nova.network.neutron [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.598 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.613 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.615 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.615 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5a47ac59-4586-4282-be2b-545a8e2d8aa8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5a47ac59-4586-4282-be2b-545a8e2d8aa8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.616 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447235.5337584, e7fc787e-783c-48a9-947b-cb7d8a412e60 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.616 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] VM Paused (Lifecycle Event)#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.616 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ca6082-7e97-466f-bbe3-e2942be70b6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.617 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-5a47ac59-4586-4282-be2b-545a8e2d8aa8
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/5a47ac59-4586-4282-be2b-545a8e2d8aa8.pid.haproxy
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 5a47ac59-4586-4282-be2b-545a8e2d8aa8
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 05:00:35 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:35.618 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'env', 'PROCESS_TAG=haproxy-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5a47ac59-4586-4282-be2b-545a8e2d8aa8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.622 187010 DEBUG oslo_concurrency.lockutils [req-d341896a-6387-4779-89ab-c23ac1b32e64 req-b8994db2-d463-4841-8f0d-b1b9a28d8281 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.648 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.651 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447235.5412748, e7fc787e-783c-48a9-947b-cb7d8a412e60 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.652 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] VM Resumed (Lifecycle Event)#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.659 187010 INFO nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Took 3.96 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.660 187010 DEBUG nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.695 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.697 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.729 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.739 187010 INFO nova.compute.manager [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Took 4.44 seconds to build instance.#033[00m
Dec 11 05:00:35 np0005555140 nova_compute[187006]: 2025-12-11 10:00:35.758 187010 DEBUG oslo_concurrency.lockutils [None req-f3fd38f3-7041-4042-aa0a-d9c45db84389 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:35 np0005555140 podman[218503]: 2025-12-11 10:00:35.987560612 +0000 UTC m=+0.056992904 container create f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 11 05:00:36 np0005555140 systemd[1]: Started libpod-conmon-f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877.scope.
Dec 11 05:00:36 np0005555140 systemd[1]: Started libcrun container.
Dec 11 05:00:36 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5d77b7bd26f2cc9c7e5195876da67d49b12b28004376f4a9fbe1df5d670c4ff/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 05:00:36 np0005555140 podman[218503]: 2025-12-11 10:00:35.959025805 +0000 UTC m=+0.028458127 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 05:00:36 np0005555140 podman[218503]: 2025-12-11 10:00:36.055950901 +0000 UTC m=+0.125383193 container init f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 11 05:00:36 np0005555140 podman[218503]: 2025-12-11 10:00:36.061807259 +0000 UTC m=+0.131239541 container start f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 05:00:36 np0005555140 neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8[218518]: [NOTICE]   (218522) : New worker (218524) forked
Dec 11 05:00:36 np0005555140 neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8[218518]: [NOTICE]   (218522) : Loading success.
Dec 11 05:00:36 np0005555140 nova_compute[187006]: 2025-12-11 10:00:36.867 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:37 np0005555140 nova_compute[187006]: 2025-12-11 10:00:37.560 187010 DEBUG nova.compute.manager [req-30f30bd7-296d-4bae-bb4f-7777c669a6d3 req-4e50924d-aaf6-4c52-9dde-cf5a24984750 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:37 np0005555140 nova_compute[187006]: 2025-12-11 10:00:37.561 187010 DEBUG oslo_concurrency.lockutils [req-30f30bd7-296d-4bae-bb4f-7777c669a6d3 req-4e50924d-aaf6-4c52-9dde-cf5a24984750 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:37 np0005555140 nova_compute[187006]: 2025-12-11 10:00:37.561 187010 DEBUG oslo_concurrency.lockutils [req-30f30bd7-296d-4bae-bb4f-7777c669a6d3 req-4e50924d-aaf6-4c52-9dde-cf5a24984750 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:37 np0005555140 nova_compute[187006]: 2025-12-11 10:00:37.562 187010 DEBUG oslo_concurrency.lockutils [req-30f30bd7-296d-4bae-bb4f-7777c669a6d3 req-4e50924d-aaf6-4c52-9dde-cf5a24984750 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:37 np0005555140 nova_compute[187006]: 2025-12-11 10:00:37.562 187010 DEBUG nova.compute.manager [req-30f30bd7-296d-4bae-bb4f-7777c669a6d3 req-4e50924d-aaf6-4c52-9dde-cf5a24984750 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] No waiting events found dispatching network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:00:37 np0005555140 nova_compute[187006]: 2025-12-11 10:00:37.562 187010 WARNING nova.compute.manager [req-30f30bd7-296d-4bae-bb4f-7777c669a6d3 req-4e50924d-aaf6-4c52-9dde-cf5a24984750 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received unexpected event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f for instance with vm_state active and task_state None.#033[00m
Dec 11 05:00:39 np0005555140 nova_compute[187006]: 2025-12-11 10:00:39.584 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:40 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:40Z|00156|binding|INFO|Releasing lport a4f6738c-9a28-4a6d-8841-3c9e211c20b8 from this chassis (sb_readonly=0)
Dec 11 05:00:40 np0005555140 NetworkManager[55531]: <info>  [1765447240.0368] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Dec 11 05:00:40 np0005555140 NetworkManager[55531]: <info>  [1765447240.0374] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Dec 11 05:00:40 np0005555140 nova_compute[187006]: 2025-12-11 10:00:40.040 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:40 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:40Z|00157|binding|INFO|Releasing lport a4f6738c-9a28-4a6d-8841-3c9e211c20b8 from this chassis (sb_readonly=0)
Dec 11 05:00:40 np0005555140 nova_compute[187006]: 2025-12-11 10:00:40.071 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:40 np0005555140 nova_compute[187006]: 2025-12-11 10:00:40.433 187010 DEBUG nova.compute.manager [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:40 np0005555140 nova_compute[187006]: 2025-12-11 10:00:40.434 187010 DEBUG nova.compute.manager [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing instance network info cache due to event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:00:40 np0005555140 nova_compute[187006]: 2025-12-11 10:00:40.434 187010 DEBUG oslo_concurrency.lockutils [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:00:40 np0005555140 nova_compute[187006]: 2025-12-11 10:00:40.434 187010 DEBUG oslo_concurrency.lockutils [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:00:40 np0005555140 nova_compute[187006]: 2025-12-11 10:00:40.435 187010 DEBUG nova.network.neutron [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:00:41 np0005555140 podman[218534]: 2025-12-11 10:00:41.689709451 +0000 UTC m=+0.066661220 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:00:41 np0005555140 podman[218535]: 2025-12-11 10:00:41.71270617 +0000 UTC m=+0.087612401 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 05:00:41 np0005555140 nova_compute[187006]: 2025-12-11 10:00:41.869 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:44 np0005555140 nova_compute[187006]: 2025-12-11 10:00:44.586 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:44 np0005555140 nova_compute[187006]: 2025-12-11 10:00:44.887 187010 DEBUG nova.network.neutron [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updated VIF entry in instance network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:00:44 np0005555140 nova_compute[187006]: 2025-12-11 10:00:44.888 187010 DEBUG nova.network.neutron [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:00:44 np0005555140 nova_compute[187006]: 2025-12-11 10:00:44.908 187010 DEBUG oslo_concurrency.lockutils [req-2babfde9-24b8-489b-91a5-9a967e808dc1 req-68c82bf8-45b2-4f16-b5c2-a77ce1e00542 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:00:46 np0005555140 nova_compute[187006]: 2025-12-11 10:00:46.872 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:46 np0005555140 podman[218599]: 2025-12-11 10:00:46.988271707 +0000 UTC m=+0.078655394 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 05:00:47 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:47Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:23:91:c4 10.100.0.5
Dec 11 05:00:47 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:47Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:23:91:c4 10.100.0.5
Dec 11 05:00:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:48.626 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:48.627 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:48.628 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:49 np0005555140 nova_compute[187006]: 2025-12-11 10:00:49.588 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.206 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.206 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.222 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.305 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.306 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.316 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.317 187010 INFO nova.compute.claims [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.446 187010 DEBUG nova.compute.provider_tree [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.613 187010 DEBUG nova.scheduler.client.report [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.640 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.640 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.681 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.682 187010 DEBUG nova.network.neutron [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.698 187010 INFO nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.717 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.874 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.875 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.876 187010 INFO nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Creating image(s)#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.876 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.877 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.877 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.890 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.892 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.915 187010 DEBUG nova.policy [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.967 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.968 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.969 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:51 np0005555140 nova_compute[187006]: 2025-12-11 10:00:51.979 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.027 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.028 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.067 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.069 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.069 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.141 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.142 187010 DEBUG nova.virt.disk.api [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.142 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.196 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.197 187010 DEBUG nova.virt.disk.api [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.197 187010 DEBUG nova.objects.instance [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 15b95a61-2ec8-42a7-92b4-e0094e84b686 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.211 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.212 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Ensure instance console log exists: /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.213 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.213 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.214 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:52 np0005555140 nova_compute[187006]: 2025-12-11 10:00:52.646 187010 DEBUG nova.network.neutron [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Successfully created port: 466e5560-fb15-4c0a-9532-8bcc93782074 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 05:00:52 np0005555140 podman[218633]: 2025-12-11 10:00:52.71431936 +0000 UTC m=+0.074188516 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.638 187010 DEBUG nova.network.neutron [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Successfully updated port: 466e5560-fb15-4c0a-9532-8bcc93782074 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.656 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.657 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.657 187010 DEBUG nova.network.neutron [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.748 187010 DEBUG nova.compute.manager [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-changed-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.749 187010 DEBUG nova.compute.manager [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Refreshing instance network info cache due to event network-changed-466e5560-fb15-4c0a-9532-8bcc93782074. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.749 187010 DEBUG oslo_concurrency.lockutils [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:00:53 np0005555140 nova_compute[187006]: 2025-12-11 10:00:53.920 187010 DEBUG nova.network.neutron [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.479 187010 DEBUG nova.network.neutron [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updating instance_info_cache with network_info: [{"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.511 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.512 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Instance network_info: |[{"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.513 187010 DEBUG oslo_concurrency.lockutils [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.513 187010 DEBUG nova.network.neutron [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Refreshing network info cache for port 466e5560-fb15-4c0a-9532-8bcc93782074 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.516 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Start _get_guest_xml network_info=[{"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.521 187010 WARNING nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.527 187010 DEBUG nova.virt.libvirt.host [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.528 187010 DEBUG nova.virt.libvirt.host [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.537 187010 DEBUG nova.virt.libvirt.host [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.538 187010 DEBUG nova.virt.libvirt.host [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.539 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.539 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.539 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.540 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.540 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.540 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.540 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.541 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.541 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.541 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.541 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.542 187010 DEBUG nova.virt.hardware [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.546 187010 DEBUG nova.virt.libvirt.vif [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T10:00:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1466142008',display_name='tempest-TestNetworkBasicOps-server-1466142008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1466142008',id=12,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPc3r89Se+TgcVpWw+RaJrqaJjoC3ikewxurYN1Ijew2tKdnN6TqwMi7UYL9ZEk0yjHlfFmP+q3SMn2mrbBLHV26V6aBso00pty24+PQlF1IcKn5AbLOEGKwqQnvdorvEg==',key_name='tempest-TestNetworkBasicOps-1350177732',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-58qfon3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T10:00:51Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=15b95a61-2ec8-42a7-92b4-e0094e84b686,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.547 187010 DEBUG nova.network.os_vif_util [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.548 187010 DEBUG nova.network.os_vif_util [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:5b:75,bridge_name='br-int',has_traffic_filtering=True,id=466e5560-fb15-4c0a-9532-8bcc93782074,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466e5560-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.549 187010 DEBUG nova.objects.instance [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 15b95a61-2ec8-42a7-92b4-e0094e84b686 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.567 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] End _get_guest_xml xml=<domain type="kvm">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <uuid>15b95a61-2ec8-42a7-92b4-e0094e84b686</uuid>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <name>instance-0000000c</name>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-1466142008</nova:name>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 10:00:54</nova:creationTime>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        <nova:port uuid="466e5560-fb15-4c0a-9532-8bcc93782074">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <system>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <entry name="serial">15b95a61-2ec8-42a7-92b4-e0094e84b686</entry>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <entry name="uuid">15b95a61-2ec8-42a7-92b4-e0094e84b686</entry>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </system>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <os>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  </os>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <features>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  </features>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  </clock>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  <devices>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </disk>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk.config"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </disk>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:6c:5b:75"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <target dev="tap466e5560-fb"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </interface>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/console.log" append="off"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </serial>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <video>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </video>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </rng>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 05:00:54 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 05:00:54 np0005555140 nova_compute[187006]:  </devices>
Dec 11 05:00:54 np0005555140 nova_compute[187006]: </domain>
Dec 11 05:00:54 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.568 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Preparing to wait for external event network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.569 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.569 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.569 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.570 187010 DEBUG nova.virt.libvirt.vif [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T10:00:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1466142008',display_name='tempest-TestNetworkBasicOps-server-1466142008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1466142008',id=12,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPc3r89Se+TgcVpWw+RaJrqaJjoC3ikewxurYN1Ijew2tKdnN6TqwMi7UYL9ZEk0yjHlfFmP+q3SMn2mrbBLHV26V6aBso00pty24+PQlF1IcKn5AbLOEGKwqQnvdorvEg==',key_name='tempest-TestNetworkBasicOps-1350177732',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-58qfon3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T10:00:51Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=15b95a61-2ec8-42a7-92b4-e0094e84b686,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.570 187010 DEBUG nova.network.os_vif_util [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.571 187010 DEBUG nova.network.os_vif_util [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:5b:75,bridge_name='br-int',has_traffic_filtering=True,id=466e5560-fb15-4c0a-9532-8bcc93782074,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466e5560-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.571 187010 DEBUG os_vif [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:5b:75,bridge_name='br-int',has_traffic_filtering=True,id=466e5560-fb15-4c0a-9532-8bcc93782074,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466e5560-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.572 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.573 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.573 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.576 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.577 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap466e5560-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.577 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap466e5560-fb, col_values=(('external_ids', {'iface-id': '466e5560-fb15-4c0a-9532-8bcc93782074', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:5b:75', 'vm-uuid': '15b95a61-2ec8-42a7-92b4-e0094e84b686'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.578 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:54 np0005555140 NetworkManager[55531]: <info>  [1765447254.5794] manager: (tap466e5560-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.582 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.586 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.587 187010 INFO os_vif [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:5b:75,bridge_name='br-int',has_traffic_filtering=True,id=466e5560-fb15-4c0a-9532-8bcc93782074,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466e5560-fb')#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.645 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.645 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.646 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:6c:5b:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 05:00:54 np0005555140 nova_compute[187006]: 2025-12-11 10:00:54.646 187010 INFO nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Using config drive#033[00m
Dec 11 05:00:54 np0005555140 podman[218663]: 2025-12-11 10:00:54.733960987 +0000 UTC m=+0.085541422 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Dec 11 05:00:54 np0005555140 podman[218662]: 2025-12-11 10:00:54.756921055 +0000 UTC m=+0.119316019 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.700 187010 INFO nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Creating config drive at /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk.config#033[00m
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.711 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwm33az7f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.842 187010 DEBUG oslo_concurrency.processutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwm33az7f" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.878 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:56 np0005555140 kernel: tap466e5560-fb: entered promiscuous mode
Dec 11 05:00:56 np0005555140 NetworkManager[55531]: <info>  [1765447256.9128] manager: (tap466e5560-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.914 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:56Z|00158|binding|INFO|Claiming lport 466e5560-fb15-4c0a-9532-8bcc93782074 for this chassis.
Dec 11 05:00:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:56Z|00159|binding|INFO|466e5560-fb15-4c0a-9532-8bcc93782074: Claiming fa:16:3e:6c:5b:75 10.100.0.14
Dec 11 05:00:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:56Z|00160|binding|INFO|Setting lport 466e5560-fb15-4c0a-9532-8bcc93782074 ovn-installed in OVS
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.927 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:56 np0005555140 nova_compute[187006]: 2025-12-11 10:00:56.930 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:56 np0005555140 systemd-udevd[218725]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 05:00:56 np0005555140 systemd-machined[153398]: New machine qemu-12-instance-0000000c.
Dec 11 05:00:56 np0005555140 NetworkManager[55531]: <info>  [1765447256.9556] device (tap466e5560-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 05:00:56 np0005555140 NetworkManager[55531]: <info>  [1765447256.9566] device (tap466e5560-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 05:00:56 np0005555140 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.079 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec 11 05:00:57 np0005555140 ovn_controller[95438]: 2025-12-11T10:00:57Z|00161|binding|INFO|Setting lport 466e5560-fb15-4c0a-9532-8bcc93782074 up in Southbound
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.113 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:5b:75 10.100.0.14'], port_security=['fa:16:3e:6c:5b:75 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '15b95a61-2ec8-42a7-92b4-e0094e84b686', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b029d25-1258-4056-9cbd-0493dd1cf004', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c13e7c47-09f9-4a17-81fa-f5bc1627478a, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=466e5560-fb15-4c0a-9532-8bcc93782074) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.114 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 466e5560-fb15-4c0a-9532-8bcc93782074 in datapath 5a47ac59-4586-4282-be2b-545a8e2d8aa8 bound to our chassis#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.116 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5a47ac59-4586-4282-be2b-545a8e2d8aa8#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.133 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e5811a-208d-4a8d-a3e0-b97c2d478eed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.167 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[145da696-7fee-4e64-b584-2fd60dd84c3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.172 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[0695716a-3aab-4db7-b5af-759c7e085a88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.204 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[338ccd09-4e34-44ef-b84d-ecf3badd500d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.228 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[68dfda8d-48e2-41b2-b8f9-0c422cfa284e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5a47ac59-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:3d:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359398, 'reachable_time': 29685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218739, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.248 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[7a620f9b-daf4-4be0-9aef-baa05fc3f00e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5a47ac59-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359411, 'tstamp': 359411}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218740, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5a47ac59-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359415, 'tstamp': 359415}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218740, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.249 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5a47ac59-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.251 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.253 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5a47ac59-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.254 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.254 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5a47ac59-40, col_values=(('external_ids', {'iface-id': 'a4f6738c-9a28-4a6d-8841-3c9e211c20b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:00:57 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:00:57.254 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.563 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447257.5630302, 15b95a61-2ec8-42a7-92b4-e0094e84b686 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.564 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] VM Started (Lifecycle Event)#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.584 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.589 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447257.5632784, 15b95a61-2ec8-42a7-92b4-e0094e84b686 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.589 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] VM Paused (Lifecycle Event)#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.617 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.622 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.858 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.966 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.966 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquired lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.967 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 05:00:57 np0005555140 nova_compute[187006]: 2025-12-11 10:00:57.967 187010 DEBUG nova.objects.instance [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e7fc787e-783c-48a9-947b-cb7d8a412e60 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.454 187010 DEBUG nova.compute.manager [req-07aa2b63-f761-48b8-8834-faea84352269 req-d6a4b592-b596-4e65-b9f8-ec50744e2013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.455 187010 DEBUG oslo_concurrency.lockutils [req-07aa2b63-f761-48b8-8834-faea84352269 req-d6a4b592-b596-4e65-b9f8-ec50744e2013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.455 187010 DEBUG oslo_concurrency.lockutils [req-07aa2b63-f761-48b8-8834-faea84352269 req-d6a4b592-b596-4e65-b9f8-ec50744e2013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.456 187010 DEBUG oslo_concurrency.lockutils [req-07aa2b63-f761-48b8-8834-faea84352269 req-d6a4b592-b596-4e65-b9f8-ec50744e2013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.456 187010 DEBUG nova.compute.manager [req-07aa2b63-f761-48b8-8834-faea84352269 req-d6a4b592-b596-4e65-b9f8-ec50744e2013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Processing event network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.458 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.462 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447258.4621446, 15b95a61-2ec8-42a7-92b4-e0094e84b686 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.463 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] VM Resumed (Lifecycle Event)#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.465 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.471 187010 INFO nova.virt.libvirt.driver [-] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Instance spawned successfully.#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.472 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.481 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.486 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.493 187010 DEBUG nova.network.neutron [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updated VIF entry in instance network info cache for port 466e5560-fb15-4c0a-9532-8bcc93782074. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.494 187010 DEBUG nova.network.neutron [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updating instance_info_cache with network_info: [{"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.507 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.508 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.509 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.510 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.511 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.512 187010 DEBUG nova.virt.libvirt.driver [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.520 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.520 187010 DEBUG oslo_concurrency.lockutils [req-6a123f9a-39ec-47d3-9b64-7e2ec9281c06 req-bfeac73f-0a86-4735-91b3-f121c995839e b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.571 187010 INFO nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Took 6.70 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.572 187010 DEBUG nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.936 187010 INFO nova.compute.manager [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Took 7.67 seconds to build instance.#033[00m
Dec 11 05:00:58 np0005555140 nova_compute[187006]: 2025-12-11 10:00:58.952 187010 DEBUG oslo_concurrency.lockutils [None req-0662e243-01a2-447f-b820-be8161d78a25 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.746s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:00:59 np0005555140 nova_compute[187006]: 2025-12-11 10:00:59.073 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:00:59 np0005555140 nova_compute[187006]: 2025-12-11 10:00:59.089 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Releasing lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:00:59 np0005555140 nova_compute[187006]: 2025-12-11 10:00:59.090 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 05:00:59 np0005555140 nova_compute[187006]: 2025-12-11 10:00:59.091 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:00:59 np0005555140 nova_compute[187006]: 2025-12-11 10:00:59.092 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:00:59 np0005555140 nova_compute[187006]: 2025-12-11 10:00:59.581 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.547 187010 DEBUG nova.compute.manager [req-eef9c68f-92cd-40bb-8703-5ab376fbfb4b req-646f21e5-15e1-40fe-9264-99ee69f23883 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.548 187010 DEBUG oslo_concurrency.lockutils [req-eef9c68f-92cd-40bb-8703-5ab376fbfb4b req-646f21e5-15e1-40fe-9264-99ee69f23883 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.549 187010 DEBUG oslo_concurrency.lockutils [req-eef9c68f-92cd-40bb-8703-5ab376fbfb4b req-646f21e5-15e1-40fe-9264-99ee69f23883 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.549 187010 DEBUG oslo_concurrency.lockutils [req-eef9c68f-92cd-40bb-8703-5ab376fbfb4b req-646f21e5-15e1-40fe-9264-99ee69f23883 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.550 187010 DEBUG nova.compute.manager [req-eef9c68f-92cd-40bb-8703-5ab376fbfb4b req-646f21e5-15e1-40fe-9264-99ee69f23883 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] No waiting events found dispatching network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.551 187010 WARNING nova.compute.manager [req-eef9c68f-92cd-40bb-8703-5ab376fbfb4b req-646f21e5-15e1-40fe-9264-99ee69f23883 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received unexpected event network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 for instance with vm_state active and task_state None.#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.830 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.830 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.856 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.857 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.857 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:00 np0005555140 nova_compute[187006]: 2025-12-11 10:01:00.857 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.150 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:01.155 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:01:01 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:01.156 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.167 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.211 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.212 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.275 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.283 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.337 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.339 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.414 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.560 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.562 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5449MB free_disk=73.29892349243164GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.562 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.563 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.639 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance e7fc787e-783c-48a9-947b-cb7d8a412e60 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.640 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 15b95a61-2ec8-42a7-92b4-e0094e84b686 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.640 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.641 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.693 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.713 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.742 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.742 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.794 187010 DEBUG nova.compute.manager [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-changed-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.795 187010 DEBUG nova.compute.manager [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Refreshing instance network info cache due to event network-changed-466e5560-fb15-4c0a-9532-8bcc93782074. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.795 187010 DEBUG oslo_concurrency.lockutils [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.796 187010 DEBUG oslo_concurrency.lockutils [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.796 187010 DEBUG nova.network.neutron [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Refreshing network info cache for port 466e5560-fb15-4c0a-9532-8bcc93782074 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:01:01 np0005555140 nova_compute[187006]: 2025-12-11 10:01:01.879 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:04 np0005555140 nova_compute[187006]: 2025-12-11 10:01:04.583 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:04 np0005555140 podman[218772]: 2025-12-11 10:01:04.684514808 +0000 UTC m=+0.057405355 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:01:04 np0005555140 nova_compute[187006]: 2025-12-11 10:01:04.736 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:06 np0005555140 nova_compute[187006]: 2025-12-11 10:01:06.193 187010 DEBUG nova.network.neutron [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updated VIF entry in instance network info cache for port 466e5560-fb15-4c0a-9532-8bcc93782074. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:01:06 np0005555140 nova_compute[187006]: 2025-12-11 10:01:06.193 187010 DEBUG nova.network.neutron [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updating instance_info_cache with network_info: [{"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:01:06 np0005555140 nova_compute[187006]: 2025-12-11 10:01:06.210 187010 DEBUG oslo_concurrency.lockutils [req-0ff00aca-3aa9-456e-8358-af11705691cd req-b12f9b0a-b0ac-46dc-a26f-9ab53cb3394d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:01:06 np0005555140 nova_compute[187006]: 2025-12-11 10:01:06.882 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:07 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:07.159 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:07 np0005555140 nova_compute[187006]: 2025-12-11 10:01:07.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:09 np0005555140 nova_compute[187006]: 2025-12-11 10:01:09.586 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:10 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:10Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:5b:75 10.100.0.14
Dec 11 05:01:10 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:10Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:5b:75 10.100.0.14
Dec 11 05:01:11 np0005555140 nova_compute[187006]: 2025-12-11 10:01:11.884 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:12 np0005555140 podman[218812]: 2025-12-11 10:01:12.700733697 +0000 UTC m=+0.066309251 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 05:01:12 np0005555140 podman[218813]: 2025-12-11 10:01:12.741834304 +0000 UTC m=+0.097464943 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251202)
Dec 11 05:01:14 np0005555140 nova_compute[187006]: 2025-12-11 10:01:14.589 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:16 np0005555140 nova_compute[187006]: 2025-12-11 10:01:16.887 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:17 np0005555140 podman[218853]: 2025-12-11 10:01:17.68492116 +0000 UTC m=+0.052623478 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 05:01:18 np0005555140 nova_compute[187006]: 2025-12-11 10:01:18.081 187010 INFO nova.compute.manager [None req-38baae49-b114-4e1f-ae82-9e1c1aa6078f 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Get console output#033[00m
Dec 11 05:01:18 np0005555140 nova_compute[187006]: 2025-12-11 10:01:18.088 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.221 187010 DEBUG nova.compute.manager [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.222 187010 DEBUG nova.compute.manager [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing instance network info cache due to event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.222 187010 DEBUG oslo_concurrency.lockutils [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.223 187010 DEBUG oslo_concurrency.lockutils [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.223 187010 DEBUG nova.network.neutron [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.289 187010 DEBUG nova.compute.manager [req-fff45d7a-2977-412d-921e-5659c9969a42 req-50bffc03-88ba-48e9-8598-959311d1262d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-unplugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.290 187010 DEBUG oslo_concurrency.lockutils [req-fff45d7a-2977-412d-921e-5659c9969a42 req-50bffc03-88ba-48e9-8598-959311d1262d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.290 187010 DEBUG oslo_concurrency.lockutils [req-fff45d7a-2977-412d-921e-5659c9969a42 req-50bffc03-88ba-48e9-8598-959311d1262d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.291 187010 DEBUG oslo_concurrency.lockutils [req-fff45d7a-2977-412d-921e-5659c9969a42 req-50bffc03-88ba-48e9-8598-959311d1262d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.291 187010 DEBUG nova.compute.manager [req-fff45d7a-2977-412d-921e-5659c9969a42 req-50bffc03-88ba-48e9-8598-959311d1262d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] No waiting events found dispatching network-vif-unplugged-af78ff97-5ade-4b55-ab3d-05399536509f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.291 187010 WARNING nova.compute.manager [req-fff45d7a-2977-412d-921e-5659c9969a42 req-50bffc03-88ba-48e9-8598-959311d1262d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received unexpected event network-vif-unplugged-af78ff97-5ade-4b55-ab3d-05399536509f for instance with vm_state active and task_state None.#033[00m
Dec 11 05:01:19 np0005555140 nova_compute[187006]: 2025-12-11 10:01:19.591 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:20 np0005555140 nova_compute[187006]: 2025-12-11 10:01:20.285 187010 INFO nova.compute.manager [None req-9c7933fe-4630-48bb-b0ee-6038d2bac9fd 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Get console output#033[00m
Dec 11 05:01:20 np0005555140 nova_compute[187006]: 2025-12-11 10:01:20.293 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 05:01:20 np0005555140 nova_compute[187006]: 2025-12-11 10:01:20.919 187010 DEBUG nova.network.neutron [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updated VIF entry in instance network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:01:20 np0005555140 nova_compute[187006]: 2025-12-11 10:01:20.921 187010 DEBUG nova.network.neutron [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:01:20 np0005555140 nova_compute[187006]: 2025-12-11 10:01:20.947 187010 DEBUG oslo_concurrency.lockutils [req-1d290891-5470-431b-a3f0-2882da614ae9 req-9c186bd8-0daf-4a8f-abba-19eaf74e5a84 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:01:21 np0005555140 nova_compute[187006]: 2025-12-11 10:01:21.500 187010 DEBUG nova.compute.manager [req-a794940f-56a0-4ccd-b839-3c9d272bfc7a req-7fa9fc5d-672b-4381-91b7-9be2743719ef b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:21 np0005555140 nova_compute[187006]: 2025-12-11 10:01:21.500 187010 DEBUG oslo_concurrency.lockutils [req-a794940f-56a0-4ccd-b839-3c9d272bfc7a req-7fa9fc5d-672b-4381-91b7-9be2743719ef b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:21 np0005555140 nova_compute[187006]: 2025-12-11 10:01:21.500 187010 DEBUG oslo_concurrency.lockutils [req-a794940f-56a0-4ccd-b839-3c9d272bfc7a req-7fa9fc5d-672b-4381-91b7-9be2743719ef b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:21 np0005555140 nova_compute[187006]: 2025-12-11 10:01:21.501 187010 DEBUG oslo_concurrency.lockutils [req-a794940f-56a0-4ccd-b839-3c9d272bfc7a req-7fa9fc5d-672b-4381-91b7-9be2743719ef b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:21 np0005555140 nova_compute[187006]: 2025-12-11 10:01:21.501 187010 DEBUG nova.compute.manager [req-a794940f-56a0-4ccd-b839-3c9d272bfc7a req-7fa9fc5d-672b-4381-91b7-9be2743719ef b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] No waiting events found dispatching network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:01:21 np0005555140 nova_compute[187006]: 2025-12-11 10:01:21.501 187010 WARNING nova.compute.manager [req-a794940f-56a0-4ccd-b839-3c9d272bfc7a req-7fa9fc5d-672b-4381-91b7-9be2743719ef b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received unexpected event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f for instance with vm_state active and task_state None.#033[00m
Dec 11 05:01:21 np0005555140 nova_compute[187006]: 2025-12-11 10:01:21.889 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:22 np0005555140 nova_compute[187006]: 2025-12-11 10:01:22.271 187010 INFO nova.compute.manager [None req-af212b15-8d34-47ee-988f-344dea3333d2 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Get console output#033[00m
Dec 11 05:01:22 np0005555140 nova_compute[187006]: 2025-12-11 10:01:22.276 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.585 187010 DEBUG nova.compute.manager [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-changed-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.585 187010 DEBUG nova.compute.manager [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Refreshing instance network info cache due to event network-changed-466e5560-fb15-4c0a-9532-8bcc93782074. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.585 187010 DEBUG oslo_concurrency.lockutils [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.586 187010 DEBUG oslo_concurrency.lockutils [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.586 187010 DEBUG nova.network.neutron [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Refreshing network info cache for port 466e5560-fb15-4c0a-9532-8bcc93782074 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.649 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.649 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.650 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.650 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.650 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.651 187010 INFO nova.compute.manager [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Terminating instance#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.652 187010 DEBUG nova.compute.manager [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.657 187010 DEBUG nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.658 187010 DEBUG nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing instance network info cache due to event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.658 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.658 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.658 187010 DEBUG nova.network.neutron [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:01:23 np0005555140 kernel: tap466e5560-fb (unregistering): left promiscuous mode
Dec 11 05:01:23 np0005555140 NetworkManager[55531]: <info>  [1765447283.6690] device (tap466e5560-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 05:01:23 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:23Z|00162|binding|INFO|Releasing lport 466e5560-fb15-4c0a-9532-8bcc93782074 from this chassis (sb_readonly=0)
Dec 11 05:01:23 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:23Z|00163|binding|INFO|Setting lport 466e5560-fb15-4c0a-9532-8bcc93782074 down in Southbound
Dec 11 05:01:23 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:23Z|00164|binding|INFO|Removing iface tap466e5560-fb ovn-installed in OVS
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.682 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:5b:75 10.100.0.14'], port_security=['fa:16:3e:6c:5b:75 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '15b95a61-2ec8-42a7-92b4-e0094e84b686', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b029d25-1258-4056-9cbd-0493dd1cf004', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c13e7c47-09f9-4a17-81fa-f5bc1627478a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=466e5560-fb15-4c0a-9532-8bcc93782074) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.684 104288 INFO neutron.agent.ovn.metadata.agent [-] Port 466e5560-fb15-4c0a-9532-8bcc93782074 in datapath 5a47ac59-4586-4282-be2b-545a8e2d8aa8 unbound from our chassis#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.684 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5a47ac59-4586-4282-be2b-545a8e2d8aa8#033[00m
Dec 11 05:01:23 np0005555140 podman[218872]: 2025-12-11 10:01:23.693142688 +0000 UTC m=+0.073362093 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.707 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[a30c9418-6668-4831-8a7f-3baa3efd37bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:23 np0005555140 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 11 05:01:23 np0005555140 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 12.532s CPU time.
Dec 11 05:01:23 np0005555140 systemd-machined[153398]: Machine qemu-12-instance-0000000c terminated.
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.733 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[86479568-cb15-4aa6-9394-44be7d371192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.737 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[4924960b-ea4f-444b-b91b-5aac8cea76c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.763 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[5b4957cb-6b5c-4fee-b811-19b1d0f39aa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.780 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[da64f5a6-7a9e-41e2-81a7-a930a31b61ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5a47ac59-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:3d:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359398, 'reachable_time': 29685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218909, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.784 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.795 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6c4458-a31a-487a-9164-160fbdda0492]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5a47ac59-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359411, 'tstamp': 359411}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218910, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap5a47ac59-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 359415, 'tstamp': 359415}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218910, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.797 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5a47ac59-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.798 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.803 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.805 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5a47ac59-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.805 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.805 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5a47ac59-40, col_values=(('external_ids', {'iface-id': 'a4f6738c-9a28-4a6d-8841-3c9e211c20b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:23 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:23.806 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.874 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.878 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.909 187010 INFO nova.virt.libvirt.driver [-] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Instance destroyed successfully.#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.909 187010 DEBUG nova.objects.instance [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 15b95a61-2ec8-42a7-92b4-e0094e84b686 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.921 187010 DEBUG nova.virt.libvirt.vif [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T10:00:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1466142008',display_name='tempest-TestNetworkBasicOps-server-1466142008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1466142008',id=12,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPc3r89Se+TgcVpWw+RaJrqaJjoC3ikewxurYN1Ijew2tKdnN6TqwMi7UYL9ZEk0yjHlfFmP+q3SMn2mrbBLHV26V6aBso00pty24+PQlF1IcKn5AbLOEGKwqQnvdorvEg==',key_name='tempest-TestNetworkBasicOps-1350177732',keypairs=<?>,launch_index=0,launched_at=2025-12-11T10:00:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-58qfon3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T10:00:58Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=15b95a61-2ec8-42a7-92b4-e0094e84b686,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.922 187010 DEBUG nova.network.os_vif_util [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.922 187010 DEBUG nova.network.os_vif_util [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:5b:75,bridge_name='br-int',has_traffic_filtering=True,id=466e5560-fb15-4c0a-9532-8bcc93782074,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466e5560-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.923 187010 DEBUG os_vif [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:5b:75,bridge_name='br-int',has_traffic_filtering=True,id=466e5560-fb15-4c0a-9532-8bcc93782074,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466e5560-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.926 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.927 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap466e5560-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.931 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.934 187010 INFO os_vif [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:5b:75,bridge_name='br-int',has_traffic_filtering=True,id=466e5560-fb15-4c0a-9532-8bcc93782074,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap466e5560-fb')
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.935 187010 INFO nova.virt.libvirt.driver [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Deleting instance files /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686_del
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.936 187010 INFO nova.virt.libvirt.driver [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Deletion of /var/lib/nova/instances/15b95a61-2ec8-42a7-92b4-e0094e84b686_del complete
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.985 187010 INFO nova.compute.manager [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Took 0.33 seconds to destroy the instance on the hypervisor.
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.986 187010 DEBUG oslo.service.loopingcall [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.986 187010 DEBUG nova.compute.manager [-] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 11 05:01:23 np0005555140 nova_compute[187006]: 2025-12-11 10:01:23.986 187010 DEBUG nova.network.neutron [-] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.547 187010 DEBUG nova.network.neutron [-] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.569 187010 INFO nova.compute.manager [-] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Took 0.58 seconds to deallocate network for instance.
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.633 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.633 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.724 187010 DEBUG nova.compute.provider_tree [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.739 187010 DEBUG nova.scheduler.client.report [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.766 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.788 187010 INFO nova.scheduler.client.report [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 15b95a61-2ec8-42a7-92b4-e0094e84b686
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.864 187010 DEBUG oslo_concurrency.lockutils [None req-8c2f7c56-c4e2-42c9-986b-2db0e95dfa07 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.886 187010 DEBUG nova.network.neutron [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updated VIF entry in instance network info cache for port 466e5560-fb15-4c0a-9532-8bcc93782074. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.886 187010 DEBUG nova.network.neutron [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Updating instance_info_cache with network_info: [{"id": "466e5560-fb15-4c0a-9532-8bcc93782074", "address": "fa:16:3e:6c:5b:75", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap466e5560-fb", "ovs_interfaceid": "466e5560-fb15-4c0a-9532-8bcc93782074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.902 187010 DEBUG oslo_concurrency.lockutils [req-c27bc8a1-87fe-4b4f-8c4b-3f005222aa4a req-e2c77e10-5d92-4468-8c9f-72cfad3250ec b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-15b95a61-2ec8-42a7-92b4-e0094e84b686" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.985 187010 DEBUG nova.network.neutron [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updated VIF entry in instance network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 11 05:01:24 np0005555140 nova_compute[187006]: 2025-12-11 10:01:24.986 187010 DEBUG nova.network.neutron [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.001 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.002 187010 DEBUG nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.002 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.002 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.002 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.002 187010 DEBUG nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] No waiting events found dispatching network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.003 187010 WARNING nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received unexpected event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f for instance with vm_state active and task_state None.
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.003 187010 DEBUG nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.003 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.003 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.003 187010 DEBUG oslo_concurrency.lockutils [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.004 187010 DEBUG nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] No waiting events found dispatching network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.004 187010 WARNING nova.compute.manager [req-27953c8a-dfa1-4ec7-a61b-d5b9631eed14 req-c4383f47-f4f3-4ece-b804-1d1d241c228d b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received unexpected event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f for instance with vm_state active and task_state None.
Dec 11 05:01:25 np0005555140 podman[218929]: 2025-12-11 10:01:25.697421635 +0000 UTC m=+0.063292325 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Dec 11 05:01:25 np0005555140 podman[218928]: 2025-12-11 10:01:25.732436428 +0000 UTC m=+0.102358394 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.735 187010 DEBUG nova.compute.manager [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-vif-unplugged-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.736 187010 DEBUG oslo_concurrency.lockutils [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.737 187010 DEBUG oslo_concurrency.lockutils [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.737 187010 DEBUG oslo_concurrency.lockutils [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.737 187010 DEBUG nova.compute.manager [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] No waiting events found dispatching network-vif-unplugged-466e5560-fb15-4c0a-9532-8bcc93782074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.737 187010 WARNING nova.compute.manager [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received unexpected event network-vif-unplugged-466e5560-fb15-4c0a-9532-8bcc93782074 for instance with vm_state deleted and task_state None.
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.738 187010 DEBUG nova.compute.manager [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.738 187010 DEBUG oslo_concurrency.lockutils [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.738 187010 DEBUG oslo_concurrency.lockutils [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.739 187010 DEBUG oslo_concurrency.lockutils [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "15b95a61-2ec8-42a7-92b4-e0094e84b686-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.739 187010 DEBUG nova.compute.manager [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] No waiting events found dispatching network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.739 187010 WARNING nova.compute.manager [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received unexpected event network-vif-plugged-466e5560-fb15-4c0a-9532-8bcc93782074 for instance with vm_state deleted and task_state None.
Dec 11 05:01:25 np0005555140 nova_compute[187006]: 2025-12-11 10:01:25.739 187010 DEBUG nova.compute.manager [req-1ffa5fa3-bccb-46ff-b6a6-1771ce763188 req-57da8e6b-7d50-4b2c-8f3e-a1505ab63ea6 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Received event network-vif-deleted-466e5560-fb15-4c0a-9532-8bcc93782074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.843 187010 DEBUG nova.compute.manager [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.843 187010 DEBUG nova.compute.manager [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing instance network info cache due to event network-changed-af78ff97-5ade-4b55-ab3d-05399536509f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.844 187010 DEBUG oslo_concurrency.lockutils [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.844 187010 DEBUG oslo_concurrency.lockutils [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.844 187010 DEBUG nova.network.neutron [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Refreshing network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.890 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.924 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.925 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.925 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.925 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.926 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.927 187010 INFO nova.compute.manager [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Terminating instance
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.928 187010 DEBUG nova.compute.manager [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 11 05:01:26 np0005555140 kernel: tapaf78ff97-5a (unregistering): left promiscuous mode
Dec 11 05:01:26 np0005555140 NetworkManager[55531]: <info>  [1765447286.9558] device (tapaf78ff97-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.964 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:26 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:26Z|00165|binding|INFO|Releasing lport af78ff97-5ade-4b55-ab3d-05399536509f from this chassis (sb_readonly=0)
Dec 11 05:01:26 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:26Z|00166|binding|INFO|Setting lport af78ff97-5ade-4b55-ab3d-05399536509f down in Southbound
Dec 11 05:01:26 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:26Z|00167|binding|INFO|Removing iface tapaf78ff97-5a ovn-installed in OVS
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.966 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:26 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:26.970 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:91:c4 10.100.0.5'], port_security=['fa:16:3e:23:91:c4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e7fc787e-783c-48a9-947b-cb7d8a412e60', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8d836b6d-b2d8-4d99-ba32-610089cbc32b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c13e7c47-09f9-4a17-81fa-f5bc1627478a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=af78ff97-5ade-4b55-ab3d-05399536509f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:01:26 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:26.974 104288 INFO neutron.agent.ovn.metadata.agent [-] Port af78ff97-5ade-4b55-ab3d-05399536509f in datapath 5a47ac59-4586-4282-be2b-545a8e2d8aa8 unbound from our chassis#033[00m
Dec 11 05:01:26 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:26.975 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5a47ac59-4586-4282-be2b-545a8e2d8aa8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 05:01:26 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:26.976 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[36bd39c2-b684-4617-a066-bcdd0cbb0dea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:26 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:26.977 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8 namespace which is not needed anymore#033[00m
Dec 11 05:01:26 np0005555140 nova_compute[187006]: 2025-12-11 10:01:26.980 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:27 np0005555140 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec 11 05:01:27 np0005555140 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 13.712s CPU time.
Dec 11 05:01:27 np0005555140 systemd-machined[153398]: Machine qemu-11-instance-0000000b terminated.
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.150 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:27 np0005555140 neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8[218518]: [NOTICE]   (218522) : haproxy version is 2.8.14-c23fe91
Dec 11 05:01:27 np0005555140 neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8[218518]: [NOTICE]   (218522) : path to executable is /usr/sbin/haproxy
Dec 11 05:01:27 np0005555140 neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8[218518]: [WARNING]  (218522) : Exiting Master process...
Dec 11 05:01:27 np0005555140 neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8[218518]: [ALERT]    (218522) : Current worker (218524) exited with code 143 (Terminated)
Dec 11 05:01:27 np0005555140 neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8[218518]: [WARNING]  (218522) : All workers exited. Exiting... (0)
Dec 11 05:01:27 np0005555140 systemd[1]: libpod-f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877.scope: Deactivated successfully.
Dec 11 05:01:27 np0005555140 podman[218998]: 2025-12-11 10:01:27.164713817 +0000 UTC m=+0.068924096 container died f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.190 187010 INFO nova.virt.libvirt.driver [-] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Instance destroyed successfully.#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.190 187010 DEBUG nova.objects.instance [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid e7fc787e-783c-48a9-947b-cb7d8a412e60 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:01:27 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877-userdata-shm.mount: Deactivated successfully.
Dec 11 05:01:27 np0005555140 systemd[1]: var-lib-containers-storage-overlay-f5d77b7bd26f2cc9c7e5195876da67d49b12b28004376f4a9fbe1df5d670c4ff-merged.mount: Deactivated successfully.
Dec 11 05:01:27 np0005555140 podman[218998]: 2025-12-11 10:01:27.209235302 +0000 UTC m=+0.113445541 container cleanup f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:01:27 np0005555140 systemd[1]: libpod-conmon-f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877.scope: Deactivated successfully.
Dec 11 05:01:27 np0005555140 podman[219045]: 2025-12-11 10:01:27.27374527 +0000 UTC m=+0.039720779 container remove f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.278 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[841578e6-5663-47bb-8a3c-1d3f46f5dfb0]: (4, ('Thu Dec 11 10:01:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8 (f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877)\nf6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877\nThu Dec 11 10:01:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8 (f6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877)\nf6d420c2ab020fd99f8136bd7f5b467bc616b5460720280e8307e381dfe18877\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.280 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f1deb7f5-5535-4076-874f-10e09c0b4792]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.280 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5a47ac59-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.282 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:27 np0005555140 kernel: tap5a47ac59-40: left promiscuous mode
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.296 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.298 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[7cadd34a-565e-4f0b-9d38-a1e17d91657b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.321 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e8895917-e4a9-4d6a-82d5-5727a568b9c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.323 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e095b8c7-3977-4188-8ff6-897ad00f46b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.336 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[c43987d9-06e6-4a5d-b49d-994cb24b685c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 359390, 'reachable_time': 36518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219065, 'error': None, 'target': 'ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.338 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5a47ac59-4586-4282-be2b-545a8e2d8aa8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 05:01:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:27.338 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[84b35bb3-3060-4c68-93ed-e06092b26387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:27 np0005555140 systemd[1]: run-netns-ovnmeta\x2d5a47ac59\x2d4586\x2d4282\x2dbe2b\x2d545a8e2d8aa8.mount: Deactivated successfully.
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.364 187010 DEBUG nova.virt.libvirt.vif [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T10:00:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1717863907',display_name='tempest-TestNetworkBasicOps-server-1717863907',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1717863907',id=11,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJMBSIhZ4EVil7AXrHPe22R5NehWPSWuc1SalCvSn9cIqHbM1CDGz5cixIH9uspQSK0YbE0eVVIj4M3PRbSdv5mbOULzw4alwWqm1ii8fiAxnjVOpwfUfrb2MBIDBIX+nw==',key_name='tempest-TestNetworkBasicOps-1529837035',keypairs=<?>,launch_index=0,launched_at=2025-12-11T10:00:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-6j1yq68b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T10:00:35Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=e7fc787e-783c-48a9-947b-cb7d8a412e60,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.365 187010 DEBUG nova.network.os_vif_util [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.366 187010 DEBUG nova.network.os_vif_util [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:91:c4,bridge_name='br-int',has_traffic_filtering=True,id=af78ff97-5ade-4b55-ab3d-05399536509f,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf78ff97-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.367 187010 DEBUG os_vif [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:91:c4,bridge_name='br-int',has_traffic_filtering=True,id=af78ff97-5ade-4b55-ab3d-05399536509f,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf78ff97-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.368 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.369 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf78ff97-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.371 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.374 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.376 187010 INFO os_vif [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:91:c4,bridge_name='br-int',has_traffic_filtering=True,id=af78ff97-5ade-4b55-ab3d-05399536509f,network=Network(5a47ac59-4586-4282-be2b-545a8e2d8aa8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf78ff97-5a')#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.376 187010 INFO nova.virt.libvirt.driver [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Deleting instance files /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60_del#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.377 187010 INFO nova.virt.libvirt.driver [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Deletion of /var/lib/nova/instances/e7fc787e-783c-48a9-947b-cb7d8a412e60_del complete#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.440 187010 INFO nova.compute.manager [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Took 0.51 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.441 187010 DEBUG oslo.service.loopingcall [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.441 187010 DEBUG nova.compute.manager [-] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 05:01:27 np0005555140 nova_compute[187006]: 2025-12-11 10:01:27.442 187010 DEBUG nova.network.neutron [-] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.444 187010 DEBUG nova.compute.manager [req-82b3150b-69d2-4fa3-adf4-20637868f799 req-006878a4-6553-48b0-b612-ec42327ce807 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-unplugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.445 187010 DEBUG oslo_concurrency.lockutils [req-82b3150b-69d2-4fa3-adf4-20637868f799 req-006878a4-6553-48b0-b612-ec42327ce807 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.445 187010 DEBUG oslo_concurrency.lockutils [req-82b3150b-69d2-4fa3-adf4-20637868f799 req-006878a4-6553-48b0-b612-ec42327ce807 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.446 187010 DEBUG oslo_concurrency.lockutils [req-82b3150b-69d2-4fa3-adf4-20637868f799 req-006878a4-6553-48b0-b612-ec42327ce807 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.446 187010 DEBUG nova.compute.manager [req-82b3150b-69d2-4fa3-adf4-20637868f799 req-006878a4-6553-48b0-b612-ec42327ce807 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] No waiting events found dispatching network-vif-unplugged-af78ff97-5ade-4b55-ab3d-05399536509f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.447 187010 DEBUG nova.compute.manager [req-82b3150b-69d2-4fa3-adf4-20637868f799 req-006878a4-6553-48b0-b612-ec42327ce807 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-unplugged-af78ff97-5ade-4b55-ab3d-05399536509f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.746 187010 DEBUG nova.network.neutron [-] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.768 187010 INFO nova.compute.manager [-] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Took 1.33 seconds to deallocate network for instance.#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.810 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.811 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.864 187010 DEBUG nova.compute.provider_tree [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.879 187010 DEBUG nova.scheduler.client.report [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.902 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.923 187010 INFO nova.scheduler.client.report [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance e7fc787e-783c-48a9-947b-cb7d8a412e60#033[00m
Dec 11 05:01:28 np0005555140 nova_compute[187006]: 2025-12-11 10:01:28.978 187010 DEBUG oslo_concurrency.lockutils [None req-249077cc-75b4-4f3e-b408-22c61b429fd0 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.135 187010 DEBUG nova.network.neutron [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updated VIF entry in instance network info cache for port af78ff97-5ade-4b55-ab3d-05399536509f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.135 187010 DEBUG nova.network.neutron [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Updating instance_info_cache with network_info: [{"id": "af78ff97-5ade-4b55-ab3d-05399536509f", "address": "fa:16:3e:23:91:c4", "network": {"id": "5a47ac59-4586-4282-be2b-545a8e2d8aa8", "bridge": "br-int", "label": "tempest-network-smoke--1040547812", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf78ff97-5a", "ovs_interfaceid": "af78ff97-5ade-4b55-ab3d-05399536509f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.152 187010 DEBUG oslo_concurrency.lockutils [req-eba240b0-2e66-49ec-ace3-6c20aa978a20 req-bf47ecb1-9696-4e2e-a3e4-a484aab862c4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-e7fc787e-783c-48a9-947b-cb7d8a412e60" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.543 187010 DEBUG nova.compute.manager [req-30aea9b8-456e-44cc-b53b-ebb2e619d1ce req-1830c43f-2547-41d9-aeaf-84b68f2d4976 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.544 187010 DEBUG oslo_concurrency.lockutils [req-30aea9b8-456e-44cc-b53b-ebb2e619d1ce req-1830c43f-2547-41d9-aeaf-84b68f2d4976 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.545 187010 DEBUG oslo_concurrency.lockutils [req-30aea9b8-456e-44cc-b53b-ebb2e619d1ce req-1830c43f-2547-41d9-aeaf-84b68f2d4976 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.545 187010 DEBUG oslo_concurrency.lockutils [req-30aea9b8-456e-44cc-b53b-ebb2e619d1ce req-1830c43f-2547-41d9-aeaf-84b68f2d4976 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "e7fc787e-783c-48a9-947b-cb7d8a412e60-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.546 187010 DEBUG nova.compute.manager [req-30aea9b8-456e-44cc-b53b-ebb2e619d1ce req-1830c43f-2547-41d9-aeaf-84b68f2d4976 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] No waiting events found dispatching network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.547 187010 WARNING nova.compute.manager [req-30aea9b8-456e-44cc-b53b-ebb2e619d1ce req-1830c43f-2547-41d9-aeaf-84b68f2d4976 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received unexpected event network-vif-plugged-af78ff97-5ade-4b55-ab3d-05399536509f for instance with vm_state deleted and task_state None.#033[00m
Dec 11 05:01:30 np0005555140 nova_compute[187006]: 2025-12-11 10:01:30.547 187010 DEBUG nova.compute.manager [req-30aea9b8-456e-44cc-b53b-ebb2e619d1ce req-1830c43f-2547-41d9-aeaf-84b68f2d4976 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Received event network-vif-deleted-af78ff97-5ade-4b55-ab3d-05399536509f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:31 np0005555140 nova_compute[187006]: 2025-12-11 10:01:31.892 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:32 np0005555140 nova_compute[187006]: 2025-12-11 10:01:32.371 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:34 np0005555140 nova_compute[187006]: 2025-12-11 10:01:34.043 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:34 np0005555140 nova_compute[187006]: 2025-12-11 10:01:34.112 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:35 np0005555140 podman[219067]: 2025-12-11 10:01:35.695926116 +0000 UTC m=+0.067188176 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 05:01:36 np0005555140 nova_compute[187006]: 2025-12-11 10:01:36.893 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:37 np0005555140 nova_compute[187006]: 2025-12-11 10:01:37.372 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:38 np0005555140 nova_compute[187006]: 2025-12-11 10:01:38.908 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447283.9062939, 15b95a61-2ec8-42a7-92b4-e0094e84b686 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:01:38 np0005555140 nova_compute[187006]: 2025-12-11 10:01:38.909 187010 INFO nova.compute.manager [-] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] VM Stopped (Lifecycle Event)#033[00m
Dec 11 05:01:38 np0005555140 nova_compute[187006]: 2025-12-11 10:01:38.933 187010 DEBUG nova.compute.manager [None req-5cebce52-5a39-4a19-ad1a-01c54dbcd095 - - - - - -] [instance: 15b95a61-2ec8-42a7-92b4-e0094e84b686] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:01:41 np0005555140 nova_compute[187006]: 2025-12-11 10:01:41.896 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:42 np0005555140 nova_compute[187006]: 2025-12-11 10:01:42.189 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447287.1881545, e7fc787e-783c-48a9-947b-cb7d8a412e60 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:01:42 np0005555140 nova_compute[187006]: 2025-12-11 10:01:42.190 187010 INFO nova.compute.manager [-] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] VM Stopped (Lifecycle Event)#033[00m
Dec 11 05:01:42 np0005555140 nova_compute[187006]: 2025-12-11 10:01:42.374 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:42 np0005555140 nova_compute[187006]: 2025-12-11 10:01:42.890 187010 DEBUG nova.compute.manager [None req-c974b531-b478-4a31-ac34-1ffd6f4f733c - - - - - -] [instance: e7fc787e-783c-48a9-947b-cb7d8a412e60] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:01:43 np0005555140 podman[219092]: 2025-12-11 10:01:43.701717805 +0000 UTC m=+0.069511203 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:01:43 np0005555140 podman[219091]: 2025-12-11 10:01:43.709689773 +0000 UTC m=+0.076429830 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 05:01:46 np0005555140 nova_compute[187006]: 2025-12-11 10:01:46.898 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:47 np0005555140 nova_compute[187006]: 2025-12-11 10:01:47.376 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:48.627 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:48.628 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:48.628 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:48 np0005555140 podman[219130]: 2025-12-11 10:01:48.687193684 +0000 UTC m=+0.061187854 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.182 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.182 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.198 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.275 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.275 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.283 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.284 187010 INFO nova.compute.claims [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.386 187010 DEBUG nova.compute.provider_tree [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.403 187010 DEBUG nova.scheduler.client.report [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.425 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.426 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.472 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.473 187010 DEBUG nova.network.neutron [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.543 187010 INFO nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.562 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.625 187010 DEBUG nova.policy [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '277eaa28c80b403abb371276e6721821', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.641 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.642 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.643 187010 INFO nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Creating image(s)#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.643 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "/var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.644 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.645 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "/var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.660 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.728 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.729 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.730 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.745 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.831 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.832 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.880 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b,backing_fmt=raw /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.881 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "dad9610605a2d7b1b8eb832e99f295d13d58693b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.882 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.942 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
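The `qemu-img` calls above follow a fixed pattern: Nova wraps each `qemu-img info` probe in `oslo_concurrency.prlimit` (capping the child's address space at 1 GiB and CPU time at 30 s), then creates a qcow2 overlay whose backing file is the cached base image. A minimal sketch of how those command lines are assembled — the JSON sample is hypothetical and abridged, since `qemu-img` itself may not be available here:

```python
import json
import shlex

def qemu_img_info_cmd(path, as_limit=1073741824, cpu_limit=30):
    """Reconstruct the wrapped probe seen in the log: prlimit caps the
    address space and CPU time of the qemu-img child process."""
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={as_limit}", f"--cpu={cpu_limit}", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]

def qcow2_overlay_cmd(base, overlay, size_bytes):
    """Reconstruct the overlay creation: a qcow2 file backed by the raw
    base image; backing_fmt is passed explicitly so qemu need not probe it."""
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        overlay, str(size_bytes),
    ]

# Hypothetical (abridged) shape of `qemu-img info --output=json` output.
sample_info = json.loads(
    '{"format": "raw", "virtual-size": 117440512, "actual-size": 21430272}'
)

base = "/var/lib/nova/instances/_base/dad9610605a2d7b1b8eb832e99f295d13d58693b"
disk = "/var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk"
print(shlex.join(qemu_img_info_cmd(base)))
print(shlex.join(qcow2_overlay_cmd(base, disk, 1073741824)))
print(sample_info["format"])
```

The `--force-share` flag matters because the base image can be open read-only by several spawning instances at once; without it `qemu-img info` would refuse to touch an image another process holds open.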
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.943 187010 DEBUG nova.virt.disk.api [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Checking if we can resize image /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 05:01:49 np0005555140 nova_compute[187006]: 2025-12-11 10:01:49.943 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.012 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.013 187010 DEBUG nova.virt.disk.api [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Cannot resize image /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.013 187010 DEBUG nova.objects.instance [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'migration_context' on Instance uuid 5237d116-713e-4af3-822e-8ce58f99769b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.031 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.031 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Ensure instance console log exists: /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.032 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.032 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.032 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
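The lockutils lines bracket every critical section with an acquire/release pair and report how long the caller waited and how long the lock was held (`vgpu_resources` above was uncontended: 0.000s both ways). A rough in-process imitation of that accounting, using stdlib `threading` rather than the real `oslo_concurrency` API:

```python
import threading
import time
from contextlib import contextmanager

_locks = {}
_locks_guard = threading.Lock()
events = []  # acquire/release records, in lockutils' "waited/held" style

@contextmanager
def timed_lock(name):
    """Named in-process lock that records wait and hold times, loosely
    mimicking the oslo_concurrency.lockutils output seen in the log."""
    with _locks_guard:
        lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    events.append(f'Lock "{name}" acquired :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        held = time.monotonic() - t1
        lock.release()
        events.append(f'Lock "{name}" "released" :: held {held:.3f}s')

with timed_lock("vgpu_resources"):
    pass
for line in events:
    print(line)
```

Note the real lockutils also supports file-based locks (as in the `disk.info` lines earlier), which serialize across processes, not just threads; this sketch covers only the in-process case.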
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:01:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:01:50 np0005555140 nova_compute[187006]: 2025-12-11 10:01:50.464 187010 DEBUG nova.network.neutron [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Successfully created port: daea0190-bb25-40bb-be7e-c2d97807d3f3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 05:01:51 np0005555140 nova_compute[187006]: 2025-12-11 10:01:51.901 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:52 np0005555140 nova_compute[187006]: 2025-12-11 10:01:52.378 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.124 187010 DEBUG nova.network.neutron [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Successfully updated port: daea0190-bb25-40bb-be7e-c2d97807d3f3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.140 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.141 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquired lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.141 187010 DEBUG nova.network.neutron [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.205 187010 DEBUG nova.compute.manager [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-changed-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.206 187010 DEBUG nova.compute.manager [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Refreshing instance network info cache due to event network-changed-daea0190-bb25-40bb-be7e-c2d97807d3f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.206 187010 DEBUG oslo_concurrency.lockutils [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.274 187010 DEBUG nova.network.neutron [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.857 187010 DEBUG nova.network.neutron [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updating instance_info_cache with network_info: [{"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.878 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Releasing lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.879 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Instance network_info: |[{"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
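The network_info structure dumped twice above is plain JSON, so walking it is straightforward. A sketch that pulls the fixed IPs out of an abridged copy of the exact entry logged for port daea0190-bb25-40bb-be7e-c2d97807d3f3:

```python
import json

# Abridged copy of the network_info entry from the log above.
network_info = json.loads("""[{
  "id": "daea0190-bb25-40bb-be7e-c2d97807d3f3",
  "address": "fa:16:3e:0b:22:fb",
  "network": {
    "id": "6329c3dd-8512-4801-95ae-a4417217513f",
    "bridge": "br-int",
    "subnets": [{
      "cidr": "10.100.0.0/28",
      "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4},
      "ips": [{"address": "10.100.0.13", "type": "fixed",
               "version": 4, "floating_ips": []}]
    }],
    "meta": {"mtu": 1442, "tunneled": true}
  },
  "type": "ovs",
  "devname": "tapdaea0190-bb"
}]""")

def fixed_ips(nw_info):
    """Collect every fixed IP across all ports and subnets of a network_info list."""
    return [
        ip["address"]
        for port in nw_info
        for subnet in port["network"]["subnets"]
        for ip in subnet["ips"]
        if ip["type"] == "fixed"
    ]

print(fixed_ips(network_info))  # -> ['10.100.0.13']
```

Two fields worth noticing in the original dump: `"active": false` (the OVN port is not yet bound and up; Nova will wait for the network-vif-plugged event) and `"mtu": 1442` (the Geneve tunnel overhead subtracted from a 1500-byte physical MTU).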
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.879 187010 DEBUG oslo_concurrency.lockutils [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.880 187010 DEBUG nova.network.neutron [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Refreshing network info cache for port daea0190-bb25-40bb-be7e-c2d97807d3f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.885 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Start _get_guest_xml network_info=[{"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'guest_format': None, 'image_id': '9e66a2ab-a034-4869-91a9-a90f37915272'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.892 187010 WARNING nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.903 187010 DEBUG nova.virt.libvirt.host [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.904 187010 DEBUG nova.virt.libvirt.host [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.908 187010 DEBUG nova.virt.libvirt.host [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.909 187010 DEBUG nova.virt.libvirt.host [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.910 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.910 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T09:51:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8ceb5bb7-cd53-4ae6-a352-a5023850ca5b',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T09:51:08Z,direct_url=<?>,disk_format='qcow2',id=9e66a2ab-a034-4869-91a9-a90f37915272,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='41851b7da4be4b6e8751110baa8eccbc',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T09:51:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.911 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.911 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.911 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.911 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.912 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.912 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.912 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.913 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.913 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.913 187010 DEBUG nova.virt.hardware [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.918 187010 DEBUG nova.virt.libvirt.vif [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T10:01:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-724408234',display_name='tempest-TestNetworkBasicOps-server-724408234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-724408234',id=13,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHlQE9NuMgPWLd69eIoEzGei6dH+gFu2IqfKoco1dXOtcxjIxkhWfCpbUuy28MboW3aL/ukwb4phezYjzzhuUxtUbTStGtiCEk70Tw73ifJHFz1jPHNp79dYmGdVRC6lHA==',key_name='tempest-TestNetworkBasicOps-536195300',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-cpqmh0yw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T10:01:49Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=5237d116-713e-4af3-822e-8ce58f99769b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.919 187010 DEBUG nova.network.os_vif_util [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.920 187010 DEBUG nova.network.os_vif_util [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:22:fb,bridge_name='br-int',has_traffic_filtering=True,id=daea0190-bb25-40bb-be7e-c2d97807d3f3,network=Network(6329c3dd-8512-4801-95ae-a4417217513f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaea0190-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.921 187010 DEBUG nova.objects.instance [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'pci_devices' on Instance uuid 5237d116-713e-4af3-822e-8ce58f99769b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.933 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] End _get_guest_xml xml=<domain type="kvm">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <uuid>5237d116-713e-4af3-822e-8ce58f99769b</uuid>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <name>instance-0000000d</name>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <memory>131072</memory>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <vcpu>1</vcpu>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <metadata>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <nova:name>tempest-TestNetworkBasicOps-server-724408234</nova:name>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <nova:creationTime>2025-12-11 10:01:53</nova:creationTime>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <nova:flavor name="m1.nano">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:memory>128</nova:memory>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:disk>1</nova:disk>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:swap>0</nova:swap>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:vcpus>1</nova:vcpus>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      </nova:flavor>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <nova:owner>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:user uuid="277eaa28c80b403abb371276e6721821">tempest-TestNetworkBasicOps-1206359647-project-member</nova:user>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:project uuid="da6c6741ea8e45ae95d918e6da5f248b">tempest-TestNetworkBasicOps-1206359647</nova:project>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      </nova:owner>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <nova:root type="image" uuid="9e66a2ab-a034-4869-91a9-a90f37915272"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <nova:ports>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        <nova:port uuid="daea0190-bb25-40bb-be7e-c2d97807d3f3">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:        </nova:port>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      </nova:ports>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </nova:instance>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  </metadata>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <sysinfo type="smbios">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <system>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <entry name="manufacturer">RDO</entry>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <entry name="product">OpenStack Compute</entry>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <entry name="serial">5237d116-713e-4af3-822e-8ce58f99769b</entry>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <entry name="uuid">5237d116-713e-4af3-822e-8ce58f99769b</entry>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <entry name="family">Virtual Machine</entry>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </system>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  </sysinfo>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <os>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <boot dev="hd"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <smbios mode="sysinfo"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  </os>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <features>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <acpi/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <apic/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <vmcoreinfo/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  </features>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <clock offset="utc">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <timer name="hpet" present="no"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  </clock>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <cpu mode="host-model" match="exact">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  </cpu>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  <devices>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <disk type="file" device="disk">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <target dev="vda" bus="virtio"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </disk>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <disk type="file" device="cdrom">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <source file="/var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk.config"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <target dev="sda" bus="sata"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </disk>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <interface type="ethernet">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <mac address="fa:16:3e:0b:22:fb"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <mtu size="1442"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <target dev="tapdaea0190-bb"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </interface>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <serial type="pty">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <log file="/var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/console.log" append="off"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </serial>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <video>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <model type="virtio"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </video>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <input type="tablet" bus="usb"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <rng model="virtio">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <backend model="random">/dev/urandom</backend>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </rng>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <controller type="usb" index="0"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    <memballoon model="virtio">
Dec 11 05:01:53 np0005555140 nova_compute[187006]:      <stats period="10"/>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:    </memballoon>
Dec 11 05:01:53 np0005555140 nova_compute[187006]:  </devices>
Dec 11 05:01:53 np0005555140 nova_compute[187006]: </domain>
Dec 11 05:01:53 np0005555140 nova_compute[187006]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.935 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Preparing to wait for external event network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.935 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.936 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.936 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.937 187010 DEBUG nova.virt.libvirt.vif [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T10:01:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-724408234',display_name='tempest-TestNetworkBasicOps-server-724408234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-724408234',id=13,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHlQE9NuMgPWLd69eIoEzGei6dH+gFu2IqfKoco1dXOtcxjIxkhWfCpbUuy28MboW3aL/ukwb4phezYjzzhuUxtUbTStGtiCEk70Tw73ifJHFz1jPHNp79dYmGdVRC6lHA==',key_name='tempest-TestNetworkBasicOps-536195300',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-cpqmh0yw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T10:01:49Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=5237d116-713e-4af3-822e-8ce58f99769b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.937 187010 DEBUG nova.network.os_vif_util [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.938 187010 DEBUG nova.network.os_vif_util [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:22:fb,bridge_name='br-int',has_traffic_filtering=True,id=daea0190-bb25-40bb-be7e-c2d97807d3f3,network=Network(6329c3dd-8512-4801-95ae-a4417217513f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaea0190-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.938 187010 DEBUG os_vif [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:22:fb,bridge_name='br-int',has_traffic_filtering=True,id=daea0190-bb25-40bb-be7e-c2d97807d3f3,network=Network(6329c3dd-8512-4801-95ae-a4417217513f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaea0190-bb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.939 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.940 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.940 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.943 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.943 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdaea0190-bb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.944 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdaea0190-bb, col_values=(('external_ids', {'iface-id': 'daea0190-bb25-40bb-be7e-c2d97807d3f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:22:fb', 'vm-uuid': '5237d116-713e-4af3-822e-8ce58f99769b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.945 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:53 np0005555140 NetworkManager[55531]: <info>  [1765447313.9465] manager: (tapdaea0190-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.949 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.951 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:53 np0005555140 nova_compute[187006]: 2025-12-11 10:01:53.953 187010 INFO os_vif [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:22:fb,bridge_name='br-int',has_traffic_filtering=True,id=daea0190-bb25-40bb-be7e-c2d97807d3f3,network=Network(6329c3dd-8512-4801-95ae-a4417217513f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaea0190-bb')#033[00m
Dec 11 05:01:54 np0005555140 nova_compute[187006]: 2025-12-11 10:01:54.004 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 05:01:54 np0005555140 nova_compute[187006]: 2025-12-11 10:01:54.005 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 05:01:54 np0005555140 nova_compute[187006]: 2025-12-11 10:01:54.005 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] No VIF found with MAC fa:16:3e:0b:22:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 05:01:54 np0005555140 nova_compute[187006]: 2025-12-11 10:01:54.006 187010 INFO nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Using config drive#033[00m
Dec 11 05:01:54 np0005555140 podman[219170]: 2025-12-11 10:01:54.087114037 +0000 UTC m=+0.090147854 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:01:55 np0005555140 nova_compute[187006]: 2025-12-11 10:01:55.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:55 np0005555140 nova_compute[187006]: 2025-12-11 10:01:55.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:55 np0005555140 nova_compute[187006]: 2025-12-11 10:01:55.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 05:01:55 np0005555140 nova_compute[187006]: 2025-12-11 10:01:55.952 187010 INFO nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Creating config drive at /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk.config#033[00m
Dec 11 05:01:55 np0005555140 nova_compute[187006]: 2025-12-11 10:01:55.964 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr9n6yac1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.108 187010 DEBUG oslo_concurrency.processutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr9n6yac1" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:01:56 np0005555140 kernel: tapdaea0190-bb: entered promiscuous mode
Dec 11 05:01:56 np0005555140 NetworkManager[55531]: <info>  [1765447316.1873] manager: (tapdaea0190-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Dec 11 05:01:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:56Z|00168|binding|INFO|Claiming lport daea0190-bb25-40bb-be7e-c2d97807d3f3 for this chassis.
Dec 11 05:01:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:56Z|00169|binding|INFO|daea0190-bb25-40bb-be7e-c2d97807d3f3: Claiming fa:16:3e:0b:22:fb 10.100.0.13
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.184 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.188 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.200 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:22:fb 10.100.0.13'], port_security=['fa:16:3e:0b:22:fb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5237d116-713e-4af3-822e-8ce58f99769b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6329c3dd-8512-4801-95ae-a4417217513f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0624eb4c-626c-4f6a-ad36-1d600829e141', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e66aee6f-5bb2-406b-adc0-0e40e0e78599, chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=daea0190-bb25-40bb-be7e-c2d97807d3f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.201 104288 INFO neutron.agent.ovn.metadata.agent [-] Port daea0190-bb25-40bb-be7e-c2d97807d3f3 in datapath 6329c3dd-8512-4801-95ae-a4417217513f bound to our chassis#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.203 104288 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6329c3dd-8512-4801-95ae-a4417217513f#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.217 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[c9af3ec1-79ab-41a9-8bd3-7b331b1af60b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.218 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6329c3dd-81 in ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 05:01:56 np0005555140 systemd-udevd[219229]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.222 213337 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6329c3dd-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.222 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[4892bb5b-c79a-4d11-aff8-b8bb472ac283]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.223 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[5c850a6f-6b4a-4c13-a53a-d8d400f5946d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 NetworkManager[55531]: <info>  [1765447316.2384] device (tapdaea0190-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 05:01:56 np0005555140 NetworkManager[55531]: <info>  [1765447316.2394] device (tapdaea0190-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.241 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[6d8fdfc7-71f8-4cbd-8980-b3f7ec5b50b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.243 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:56 np0005555140 systemd-machined[153398]: New machine qemu-13-instance-0000000d.
Dec 11 05:01:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:56Z|00170|binding|INFO|Setting lport daea0190-bb25-40bb-be7e-c2d97807d3f3 ovn-installed in OVS
Dec 11 05:01:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:56Z|00171|binding|INFO|Setting lport daea0190-bb25-40bb-be7e-c2d97807d3f3 up in Southbound
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.251 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:56 np0005555140 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.261 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef4e58b-3f5d-40be-be80-528c4f0e3fcf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.290 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[81440161-9d69-4a4f-96fe-30ea6edf62df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 podman[219205]: 2025-12-11 10:01:56.294649621 +0000 UTC m=+0.119543436 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.297 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5c832c-e81e-43a2-be3b-e0c32a751195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 NetworkManager[55531]: <info>  [1765447316.2978] manager: (tap6329c3dd-80): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Dec 11 05:01:56 np0005555140 podman[219206]: 2025-12-11 10:01:56.325368141 +0000 UTC m=+0.146925431 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.325 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[ccbed721-9bf9-495c-97f0-4d7aa3eb38bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.327 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[ef165aa7-22a6-44e7-9651-909c6fcf4ad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 NetworkManager[55531]: <info>  [1765447316.3458] device (tap6329c3dd-80): carrier: link connected
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.349 213358 DEBUG oslo.privsep.daemon [-] privsep: reply[25f1234c-cea0-420d-97d8-9b87dd661a79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.362 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4d7183-c996-4098-945a-76d8c7477d12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6329c3dd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:09:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367491, 'reachable_time': 34669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219292, 'error': None, 'target': 'ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.374 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[5d825b42-a180-4b55-ad23-a6c0413c53e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:9d1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 367491, 'tstamp': 367491}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219293, 'error': None, 'target': 'ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.390 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6bc3dda6-c4f1-4f1b-9816-8b2cb480f3a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6329c3dd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:09:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367491, 'reachable_time': 34669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219294, 'error': None, 'target': 'ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.416 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e8867c58-4d22-4681-970c-e6e9b097d20d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.467 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7a8225-e2a1-45a4-a605-924152f6556b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.468 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6329c3dd-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.469 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.469 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6329c3dd-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:56 np0005555140 NetworkManager[55531]: <info>  [1765447316.4720] manager: (tap6329c3dd-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Dec 11 05:01:56 np0005555140 kernel: tap6329c3dd-80: entered promiscuous mode
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.475 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6329c3dd-80, col_values=(('external_ids', {'iface-id': '0689168e-f9e0-4ea8-a55a-ef3b57c5ee9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:01:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:01:56Z|00172|binding|INFO|Releasing lport 0689168e-f9e0-4ea8-a55a-ef3b57c5ee9e from this chassis (sb_readonly=0)
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.490 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.496 104288 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6329c3dd-8512-4801-95ae-a4417217513f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6329c3dd-8512-4801-95ae-a4417217513f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.497 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[3a210b8c-8ddc-4ae5-9fe8-ac5ad7bb1a31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.498 104288 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: global
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    log         /dev/log local0 debug
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    log-tag     haproxy-metadata-proxy-6329c3dd-8512-4801-95ae-a4417217513f
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    user        root
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    group       root
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    maxconn     1024
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    pidfile     /var/lib/neutron/external/pids/6329c3dd-8512-4801-95ae-a4417217513f.pid.haproxy
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    daemon
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: defaults
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    log global
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    mode http
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    option httplog
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    option dontlognull
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    option http-server-close
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    option forwardfor
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.498 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    retries                 3
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    timeout http-request    30s
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    timeout connect         30s
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    timeout client          32s
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    timeout server          32s
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    timeout http-keep-alive 30s
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: listen listener
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    bind 169.254.169.254:80
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]:    http-request add-header X-OVN-Network-ID 6329c3dd-8512-4801-95ae-a4417217513f
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 05:01:56 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:01:56.499 104288 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f', 'env', 'PROCESS_TAG=haproxy-6329c3dd-8512-4801-95ae-a4417217513f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6329c3dd-8512-4801-95ae-a4417217513f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.786 187010 DEBUG nova.compute.manager [req-ef77946d-de26-436e-894a-245edb01fe55 req-fb68bb95-765c-493a-a8ae-bbbd49920023 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.787 187010 DEBUG oslo_concurrency.lockutils [req-ef77946d-de26-436e-894a-245edb01fe55 req-fb68bb95-765c-493a-a8ae-bbbd49920023 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.787 187010 DEBUG oslo_concurrency.lockutils [req-ef77946d-de26-436e-894a-245edb01fe55 req-fb68bb95-765c-493a-a8ae-bbbd49920023 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.787 187010 DEBUG oslo_concurrency.lockutils [req-ef77946d-de26-436e-894a-245edb01fe55 req-fb68bb95-765c-493a-a8ae-bbbd49920023 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.788 187010 DEBUG nova.compute.manager [req-ef77946d-de26-436e-894a-245edb01fe55 req-fb68bb95-765c-493a-a8ae-bbbd49920023 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Processing event network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:56 np0005555140 podman[219323]: 2025-12-11 10:01:56.865139725 +0000 UTC m=+0.052996019 container create c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Dec 11 05:01:56 np0005555140 systemd[1]: Started libpod-conmon-c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d.scope.
Dec 11 05:01:56 np0005555140 nova_compute[187006]: 2025-12-11 10:01:56.902 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:56 np0005555140 systemd[1]: Started libcrun container.
Dec 11 05:01:56 np0005555140 podman[219323]: 2025-12-11 10:01:56.833177879 +0000 UTC m=+0.021034193 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Dec 11 05:01:56 np0005555140 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ef470c4c63710a5e5e4ede4f778e165b355fe949a00039fd44b191499165c91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 05:01:56 np0005555140 podman[219323]: 2025-12-11 10:01:56.939758973 +0000 UTC m=+0.127615287 container init c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 05:01:56 np0005555140 podman[219323]: 2025-12-11 10:01:56.944704874 +0000 UTC m=+0.132561168 container start c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:01:56 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [NOTICE]   (219343) : New worker (219345) forked
Dec 11 05:01:56 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [NOTICE]   (219343) : Loading success.
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.091 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.092 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447317.0907745, 5237d116-713e-4af3-822e-8ce58f99769b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.093 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] VM Started (Lifecycle Event)#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.097 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.103 187010 INFO nova.virt.libvirt.driver [-] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Instance spawned successfully.#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.103 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.166 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.175 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.183 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.184 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.185 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.185 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.186 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.187 187010 DEBUG nova.virt.libvirt.driver [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.231 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.232 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447317.091813, 5237d116-713e-4af3-822e-8ce58f99769b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.233 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] VM Paused (Lifecycle Event)#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.268 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.273 187010 DEBUG nova.virt.driver [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] Emitting event <LifecycleEvent: 1765447317.0971527, 5237d116-713e-4af3-822e-8ce58f99769b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.274 187010 INFO nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] VM Resumed (Lifecycle Event)#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.285 187010 INFO nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Took 7.64 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.286 187010 DEBUG nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.455 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.459 187010 DEBUG nova.compute.manager [None req-3b1f35ab-0a06-4c56-9ef7-723ddeeeae88 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.496 187010 INFO nova.compute.manager [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Took 8.25 seconds to build instance.#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.513 187010 DEBUG oslo_concurrency.lockutils [None req-74cdcef9-74bc-406a-b4fa-8687fd9c32c7 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:57 np0005555140 nova_compute[187006]: 2025-12-11 10:01:57.845 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.119 187010 DEBUG nova.network.neutron [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updated VIF entry in instance network info cache for port daea0190-bb25-40bb-be7e-c2d97807d3f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.120 187010 DEBUG nova.network.neutron [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updating instance_info_cache with network_info: [{"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.139 187010 DEBUG oslo_concurrency.lockutils [req-421aa6de-5e1f-463b-8bdd-05632fc7d087 req-fbf74d79-e251-4140-87f1-8f871a181cd4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.882 187010 DEBUG nova.compute.manager [req-4d0f30d1-05fc-4938-80df-fab3be471dc1 req-0f34a5f8-2cac-40a4-ac40-1d73aebfead1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.882 187010 DEBUG oslo_concurrency.lockutils [req-4d0f30d1-05fc-4938-80df-fab3be471dc1 req-0f34a5f8-2cac-40a4-ac40-1d73aebfead1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.883 187010 DEBUG oslo_concurrency.lockutils [req-4d0f30d1-05fc-4938-80df-fab3be471dc1 req-0f34a5f8-2cac-40a4-ac40-1d73aebfead1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.883 187010 DEBUG oslo_concurrency.lockutils [req-4d0f30d1-05fc-4938-80df-fab3be471dc1 req-0f34a5f8-2cac-40a4-ac40-1d73aebfead1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.883 187010 DEBUG nova.compute.manager [req-4d0f30d1-05fc-4938-80df-fab3be471dc1 req-0f34a5f8-2cac-40a4-ac40-1d73aebfead1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] No waiting events found dispatching network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.883 187010 WARNING nova.compute.manager [req-4d0f30d1-05fc-4938-80df-fab3be471dc1 req-0f34a5f8-2cac-40a4-ac40-1d73aebfead1 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received unexpected event network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 for instance with vm_state active and task_state None.#033[00m
Dec 11 05:01:58 np0005555140 nova_compute[187006]: 2025-12-11 10:01:58.947 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:01:59 np0005555140 nova_compute[187006]: 2025-12-11 10:01:59.929 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:01:59 np0005555140 nova_compute[187006]: 2025-12-11 10:01:59.930 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquired lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:01:59 np0005555140 nova_compute[187006]: 2025-12-11 10:01:59.931 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 05:01:59 np0005555140 nova_compute[187006]: 2025-12-11 10:01:59.931 187010 DEBUG nova.objects.instance [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5237d116-713e-4af3-822e-8ce58f99769b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:02:01 np0005555140 nova_compute[187006]: 2025-12-11 10:02:01.904 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:03 np0005555140 nova_compute[187006]: 2025-12-11 10:02:03.951 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.929 187010 DEBUG nova.network.neutron [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updating instance_info_cache with network_info: [{"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.955 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Releasing lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.956 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.957 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.957 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.957 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.958 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.958 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.983 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.983 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.984 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:04 np0005555140 nova_compute[187006]: 2025-12-11 10:02:04.984 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.055 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.109 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.111 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.163 187010 DEBUG oslo_concurrency.processutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.323 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.325 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5576MB free_disk=73.32754135131836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.325 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.326 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.480 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Instance 5237d116-713e-4af3-822e-8ce58f99769b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.481 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.482 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.626 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.646 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.670 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.671 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.671 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.672 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 05:02:05 np0005555140 nova_compute[187006]: 2025-12-11 10:02:05.690 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 05:02:06 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:06Z|00173|binding|INFO|Releasing lport 0689168e-f9e0-4ea8-a55a-ef3b57c5ee9e from this chassis (sb_readonly=0)
Dec 11 05:02:06 np0005555140 nova_compute[187006]: 2025-12-11 10:02:06.329 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:06 np0005555140 NetworkManager[55531]: <info>  [1765447326.3309] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Dec 11 05:02:06 np0005555140 NetworkManager[55531]: <info>  [1765447326.3320] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Dec 11 05:02:06 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:06Z|00174|binding|INFO|Releasing lport 0689168e-f9e0-4ea8-a55a-ef3b57c5ee9e from this chassis (sb_readonly=0)
Dec 11 05:02:06 np0005555140 nova_compute[187006]: 2025-12-11 10:02:06.354 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:06 np0005555140 nova_compute[187006]: 2025-12-11 10:02:06.362 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:06 np0005555140 podman[219369]: 2025-12-11 10:02:06.684356736 +0000 UTC m=+0.054656587 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 05:02:06 np0005555140 nova_compute[187006]: 2025-12-11 10:02:06.906 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:07 np0005555140 nova_compute[187006]: 2025-12-11 10:02:07.029 187010 DEBUG nova.compute.manager [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-changed-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:02:07 np0005555140 nova_compute[187006]: 2025-12-11 10:02:07.029 187010 DEBUG nova.compute.manager [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Refreshing instance network info cache due to event network-changed-daea0190-bb25-40bb-be7e-c2d97807d3f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:02:07 np0005555140 nova_compute[187006]: 2025-12-11 10:02:07.029 187010 DEBUG oslo_concurrency.lockutils [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:02:07 np0005555140 nova_compute[187006]: 2025-12-11 10:02:07.030 187010 DEBUG oslo_concurrency.lockutils [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:02:07 np0005555140 nova_compute[187006]: 2025-12-11 10:02:07.030 187010 DEBUG nova.network.neutron [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Refreshing network info cache for port daea0190-bb25-40bb-be7e-c2d97807d3f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:02:07 np0005555140 nova_compute[187006]: 2025-12-11 10:02:07.687 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:07 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:07Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:22:fb 10.100.0.13
Dec 11 05:02:07 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:07Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:22:fb 10.100.0.13
Dec 11 05:02:08 np0005555140 nova_compute[187006]: 2025-12-11 10:02:08.408 187010 DEBUG nova.network.neutron [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updated VIF entry in instance network info cache for port daea0190-bb25-40bb-be7e-c2d97807d3f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:02:08 np0005555140 nova_compute[187006]: 2025-12-11 10:02:08.409 187010 DEBUG nova.network.neutron [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updating instance_info_cache with network_info: [{"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:02:08 np0005555140 nova_compute[187006]: 2025-12-11 10:02:08.426 187010 DEBUG oslo_concurrency.lockutils [req-a37a50ba-4f44-4405-ad1e-1d6222c27430 req-46861a1e-f92a-4df4-99fc-822700e95013 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:02:08 np0005555140 nova_compute[187006]: 2025-12-11 10:02:08.955 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:09 np0005555140 nova_compute[187006]: 2025-12-11 10:02:09.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:11 np0005555140 nova_compute[187006]: 2025-12-11 10:02:11.909 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:13 np0005555140 nova_compute[187006]: 2025-12-11 10:02:13.293 187010 INFO nova.compute.manager [None req-ce8cd12d-d870-47a6-948b-55b1949b794d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Get console output#033[00m
Dec 11 05:02:13 np0005555140 nova_compute[187006]: 2025-12-11 10:02:13.300 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 05:02:13 np0005555140 nova_compute[187006]: 2025-12-11 10:02:13.957 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:14 np0005555140 podman[219409]: 2025-12-11 10:02:14.692677796 +0000 UTC m=+0.060684819 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:02:14 np0005555140 podman[219410]: 2025-12-11 10:02:14.697067942 +0000 UTC m=+0.061968756 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 11 05:02:14 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:14Z|00175|binding|INFO|Releasing lport 0689168e-f9e0-4ea8-a55a-ef3b57c5ee9e from this chassis (sb_readonly=0)
Dec 11 05:02:14 np0005555140 nova_compute[187006]: 2025-12-11 10:02:14.942 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:14 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:14Z|00176|binding|INFO|Releasing lport 0689168e-f9e0-4ea8-a55a-ef3b57c5ee9e from this chassis (sb_readonly=0)
Dec 11 05:02:14 np0005555140 nova_compute[187006]: 2025-12-11 10:02:14.992 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:16 np0005555140 nova_compute[187006]: 2025-12-11 10:02:16.105 187010 INFO nova.compute.manager [None req-56abfdfa-f8e5-4dc6-b0e8-2651376b19ba 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Get console output#033[00m
Dec 11 05:02:16 np0005555140 nova_compute[187006]: 2025-12-11 10:02:16.109 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 05:02:16 np0005555140 nova_compute[187006]: 2025-12-11 10:02:16.911 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:18 np0005555140 nova_compute[187006]: 2025-12-11 10:02:18.052 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:18 np0005555140 NetworkManager[55531]: <info>  [1765447338.0531] manager: (patch-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Dec 11 05:02:18 np0005555140 NetworkManager[55531]: <info>  [1765447338.0539] manager: (patch-br-int-to-provnet-7464eeb1-5066-4d33-92e7-2935c0e63a41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec 11 05:02:18 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:18Z|00177|binding|INFO|Releasing lport 0689168e-f9e0-4ea8-a55a-ef3b57c5ee9e from this chassis (sb_readonly=0)
Dec 11 05:02:18 np0005555140 nova_compute[187006]: 2025-12-11 10:02:18.101 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:18 np0005555140 nova_compute[187006]: 2025-12-11 10:02:18.105 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:18 np0005555140 nova_compute[187006]: 2025-12-11 10:02:18.409 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:18 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:18.411 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:02:18 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:18.412 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 05:02:18 np0005555140 nova_compute[187006]: 2025-12-11 10:02:18.533 187010 INFO nova.compute.manager [None req-6867318c-d07b-4a14-a693-db4d88133891 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Get console output#033[00m
Dec 11 05:02:18 np0005555140 nova_compute[187006]: 2025-12-11 10:02:18.539 213253 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 05:02:18 np0005555140 nova_compute[187006]: 2025-12-11 10:02:18.959 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.366 187010 DEBUG nova.compute.manager [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-changed-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.366 187010 DEBUG nova.compute.manager [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Refreshing instance network info cache due to event network-changed-daea0190-bb25-40bb-be7e-c2d97807d3f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.366 187010 DEBUG oslo_concurrency.lockutils [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.366 187010 DEBUG oslo_concurrency.lockutils [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquired lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.367 187010 DEBUG nova.network.neutron [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Refreshing network info cache for port daea0190-bb25-40bb-be7e-c2d97807d3f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.457 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.457 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.458 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.458 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.458 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.460 187010 INFO nova.compute.manager [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Terminating instance#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.461 187010 DEBUG nova.compute.manager [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 05:02:19 np0005555140 kernel: tapdaea0190-bb (unregistering): left promiscuous mode
Dec 11 05:02:19 np0005555140 NetworkManager[55531]: <info>  [1765447339.4947] device (tapdaea0190-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 05:02:19 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:19Z|00178|binding|INFO|Releasing lport daea0190-bb25-40bb-be7e-c2d97807d3f3 from this chassis (sb_readonly=0)
Dec 11 05:02:19 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:19Z|00179|binding|INFO|Setting lport daea0190-bb25-40bb-be7e-c2d97807d3f3 down in Southbound
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.501 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:19Z|00180|binding|INFO|Removing iface tapdaea0190-bb ovn-installed in OVS
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.503 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.513 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:22:fb 10.100.0.13'], port_security=['fa:16:3e:0b:22:fb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5237d116-713e-4af3-822e-8ce58f99769b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6329c3dd-8512-4801-95ae-a4417217513f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da6c6741ea8e45ae95d918e6da5f248b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0624eb4c-626c-4f6a-ad36-1d600829e141', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e66aee6f-5bb2-406b-adc0-0e40e0e78599, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>], logical_port=daea0190-bb25-40bb-be7e-c2d97807d3f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fce7d349d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.515 104288 INFO neutron.agent.ovn.metadata.agent [-] Port daea0190-bb25-40bb-be7e-c2d97807d3f3 in datapath 6329c3dd-8512-4801-95ae-a4417217513f unbound from our chassis#033[00m
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.516 104288 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6329c3dd-8512-4801-95ae-a4417217513f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.517 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.517 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[dca96c0f-dd18-45dc-ab34-631f8632c083]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.518 104288 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f namespace which is not needed anymore#033[00m
Dec 11 05:02:19 np0005555140 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec 11 05:02:19 np0005555140 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.919s CPU time.
Dec 11 05:02:19 np0005555140 systemd-machined[153398]: Machine qemu-13-instance-0000000d terminated.
Dec 11 05:02:19 np0005555140 podman[219450]: 2025-12-11 10:02:19.585510564 +0000 UTC m=+0.055250104 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.717 187010 INFO nova.virt.libvirt.driver [-] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Instance destroyed successfully.#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.718 187010 DEBUG nova.objects.instance [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lazy-loading 'resources' on Instance uuid 5237d116-713e-4af3-822e-8ce58f99769b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.736 187010 DEBUG nova.virt.libvirt.vif [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T10:01:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-724408234',display_name='tempest-TestNetworkBasicOps-server-724408234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-724408234',id=13,image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHlQE9NuMgPWLd69eIoEzGei6dH+gFu2IqfKoco1dXOtcxjIxkhWfCpbUuy28MboW3aL/ukwb4phezYjzzhuUxtUbTStGtiCEk70Tw73ifJHFz1jPHNp79dYmGdVRC6lHA==',key_name='tempest-TestNetworkBasicOps-536195300',keypairs=<?>,launch_index=0,launched_at=2025-12-11T10:01:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='da6c6741ea8e45ae95d918e6da5f248b',ramdisk_id='',reservation_id='r-cpqmh0yw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9e66a2ab-a034-4869-91a9-a90f37915272',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1206359647',owner_user_name='tempest-TestNetworkBasicOps-1206359647-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T10:01:57Z,user_data=None,user_id='277eaa28c80b403abb371276e6721821',uuid=5237d116-713e-4af3-822e-8ce58f99769b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.737 187010 DEBUG nova.network.os_vif_util [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converting VIF {"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.737 187010 DEBUG nova.network.os_vif_util [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:22:fb,bridge_name='br-int',has_traffic_filtering=True,id=daea0190-bb25-40bb-be7e-c2d97807d3f3,network=Network(6329c3dd-8512-4801-95ae-a4417217513f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaea0190-bb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.738 187010 DEBUG os_vif [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:22:fb,bridge_name='br-int',has_traffic_filtering=True,id=daea0190-bb25-40bb-be7e-c2d97807d3f3,network=Network(6329c3dd-8512-4801-95ae-a4417217513f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaea0190-bb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.739 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.739 187010 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdaea0190-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.741 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.743 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.744 187010 INFO os_vif [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:22:fb,bridge_name='br-int',has_traffic_filtering=True,id=daea0190-bb25-40bb-be7e-c2d97807d3f3,network=Network(6329c3dd-8512-4801-95ae-a4417217513f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaea0190-bb')#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.745 187010 INFO nova.virt.libvirt.driver [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Deleting instance files /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b_del#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.745 187010 INFO nova.virt.libvirt.driver [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Deletion of /var/lib/nova/instances/5237d116-713e-4af3-822e-8ce58f99769b_del complete#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.758 187010 DEBUG nova.compute.manager [req-07701a04-f124-4d69-a2a4-ba6d97111ef8 req-8a7bc9b7-e37e-45a8-9e72-c7edcaf9d4e9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-vif-unplugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.758 187010 DEBUG oslo_concurrency.lockutils [req-07701a04-f124-4d69-a2a4-ba6d97111ef8 req-8a7bc9b7-e37e-45a8-9e72-c7edcaf9d4e9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.759 187010 DEBUG oslo_concurrency.lockutils [req-07701a04-f124-4d69-a2a4-ba6d97111ef8 req-8a7bc9b7-e37e-45a8-9e72-c7edcaf9d4e9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.759 187010 DEBUG oslo_concurrency.lockutils [req-07701a04-f124-4d69-a2a4-ba6d97111ef8 req-8a7bc9b7-e37e-45a8-9e72-c7edcaf9d4e9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.759 187010 DEBUG nova.compute.manager [req-07701a04-f124-4d69-a2a4-ba6d97111ef8 req-8a7bc9b7-e37e-45a8-9e72-c7edcaf9d4e9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] No waiting events found dispatching network-vif-unplugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.759 187010 DEBUG nova.compute.manager [req-07701a04-f124-4d69-a2a4-ba6d97111ef8 req-8a7bc9b7-e37e-45a8-9e72-c7edcaf9d4e9 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-vif-unplugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.808 187010 INFO nova.compute.manager [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.809 187010 DEBUG oslo.service.loopingcall [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.809 187010 DEBUG nova.compute.manager [-] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.809 187010 DEBUG nova.network.neutron [-] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 05:02:19 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [NOTICE]   (219343) : haproxy version is 2.8.14-c23fe91
Dec 11 05:02:19 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [NOTICE]   (219343) : path to executable is /usr/sbin/haproxy
Dec 11 05:02:19 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [WARNING]  (219343) : Exiting Master process...
Dec 11 05:02:19 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [WARNING]  (219343) : Exiting Master process...
Dec 11 05:02:19 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [ALERT]    (219343) : Current worker (219345) exited with code 143 (Terminated)
Dec 11 05:02:19 np0005555140 neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f[219339]: [WARNING]  (219343) : All workers exited. Exiting... (0)
Dec 11 05:02:19 np0005555140 systemd[1]: libpod-c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d.scope: Deactivated successfully.
Dec 11 05:02:19 np0005555140 conmon[219339]: conmon c5e0baa5aadfc2dbcbab <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d.scope/container/memory.events
Dec 11 05:02:19 np0005555140 podman[219493]: 2025-12-11 10:02:19.873463642 +0000 UTC m=+0.267561745 container died c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:02:19 np0005555140 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d-userdata-shm.mount: Deactivated successfully.
Dec 11 05:02:19 np0005555140 systemd[1]: var-lib-containers-storage-overlay-0ef470c4c63710a5e5e4ede4f778e165b355fe949a00039fd44b191499165c91-merged.mount: Deactivated successfully.
Dec 11 05:02:19 np0005555140 podman[219493]: 2025-12-11 10:02:19.921824708 +0000 UTC m=+0.315922801 container cleanup c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 05:02:19 np0005555140 systemd[1]: libpod-conmon-c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d.scope: Deactivated successfully.
Dec 11 05:02:19 np0005555140 podman[219541]: 2025-12-11 10:02:19.988086126 +0000 UTC m=+0.045997549 container remove c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.993 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[f9339453-ddca-41ee-8088-638ebc227c30]: (4, ('Thu Dec 11 10:02:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f (c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d)\nc5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d\nThu Dec 11 10:02:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f (c5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d)\nc5e0baa5aadfc2dbcbab31af0f2c0fb2258fbf313ad016dcb6ba80a3cc2bfc4d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.994 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e4de11-3937-4cb9-9678-e66fc50457cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:19 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:19.995 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6329c3dd-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:02:19 np0005555140 nova_compute[187006]: 2025-12-11 10:02:19.997 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:19 np0005555140 kernel: tap6329c3dd-80: left promiscuous mode
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.008 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:20.011 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[df154f7d-7454-4875-a4f6-2038977caf2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:20.030 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[3fda803f-155e-4f00-9c01-12d3cb985914]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:20.031 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[d985aa03-aab3-44c9-9505-79003b3aef4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:20.046 213337 DEBUG oslo.privsep.daemon [-] privsep: reply[6267e504-8a07-4eba-93ae-4af5568e6700]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367485, 'reachable_time': 43114, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219556, 'error': None, 'target': 'ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:20 np0005555140 systemd[1]: run-netns-ovnmeta\x2d6329c3dd\x2d8512\x2d4801\x2d95ae\x2da4417217513f.mount: Deactivated successfully.
Dec 11 05:02:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:20.049 104402 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6329c3dd-8512-4801-95ae-a4417217513f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 05:02:20 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:20.049 104402 DEBUG oslo.privsep.daemon [-] privsep: reply[86f8166f-8dda-4693-956e-f1046cf37aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.525 187010 DEBUG nova.network.neutron [-] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.553 187010 INFO nova.compute.manager [-] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Took 0.74 seconds to deallocate network for instance.#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.608 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.608 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.674 187010 DEBUG nova.compute.provider_tree [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.699 187010 DEBUG nova.scheduler.client.report [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.719 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.749 187010 INFO nova.scheduler.client.report [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Deleted allocations for instance 5237d116-713e-4af3-822e-8ce58f99769b#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.818 187010 DEBUG oslo_concurrency.lockutils [None req-4f25da67-e606-48f0-93ce-12f9af52843d 277eaa28c80b403abb371276e6721821 da6c6741ea8e45ae95d918e6da5f248b - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.955 187010 DEBUG nova.network.neutron [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updated VIF entry in instance network info cache for port daea0190-bb25-40bb-be7e-c2d97807d3f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.956 187010 DEBUG nova.network.neutron [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Updating instance_info_cache with network_info: [{"id": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "address": "fa:16:3e:0b:22:fb", "network": {"id": "6329c3dd-8512-4801-95ae-a4417217513f", "bridge": "br-int", "label": "tempest-network-smoke--1140463982", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "da6c6741ea8e45ae95d918e6da5f248b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaea0190-bb", "ovs_interfaceid": "daea0190-bb25-40bb-be7e-c2d97807d3f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 05:02:20 np0005555140 nova_compute[187006]: 2025-12-11 10:02:20.986 187010 DEBUG oslo_concurrency.lockutils [req-ea0d2ab0-5c3a-47c5-9611-7c8788d8b239 req-d5074d6b-b55e-4ff8-8432-acf93ca123f4 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Releasing lock "refresh_cache-5237d116-713e-4af3-822e-8ce58f99769b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.847 187010 DEBUG nova.compute.manager [req-0256c5f4-0e51-4776-9e30-bad6ca2a2a6e req-2f06d25c-0005-475b-a829-7dd12da0d789 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.848 187010 DEBUG oslo_concurrency.lockutils [req-0256c5f4-0e51-4776-9e30-bad6ca2a2a6e req-2f06d25c-0005-475b-a829-7dd12da0d789 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Acquiring lock "5237d116-713e-4af3-822e-8ce58f99769b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.848 187010 DEBUG oslo_concurrency.lockutils [req-0256c5f4-0e51-4776-9e30-bad6ca2a2a6e req-2f06d25c-0005-475b-a829-7dd12da0d789 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.848 187010 DEBUG oslo_concurrency.lockutils [req-0256c5f4-0e51-4776-9e30-bad6ca2a2a6e req-2f06d25c-0005-475b-a829-7dd12da0d789 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] Lock "5237d116-713e-4af3-822e-8ce58f99769b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.848 187010 DEBUG nova.compute.manager [req-0256c5f4-0e51-4776-9e30-bad6ca2a2a6e req-2f06d25c-0005-475b-a829-7dd12da0d789 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] No waiting events found dispatching network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.849 187010 WARNING nova.compute.manager [req-0256c5f4-0e51-4776-9e30-bad6ca2a2a6e req-2f06d25c-0005-475b-a829-7dd12da0d789 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received unexpected event network-vif-plugged-daea0190-bb25-40bb-be7e-c2d97807d3f3 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.849 187010 DEBUG nova.compute.manager [req-0256c5f4-0e51-4776-9e30-bad6ca2a2a6e req-2f06d25c-0005-475b-a829-7dd12da0d789 b19233ef32904ef3bc35a60ba38a1251 5acdff57a91b450b943160bcb75969cd - - default default] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Received event network-vif-deleted-daea0190-bb25-40bb-be7e-c2d97807d3f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 05:02:21 np0005555140 nova_compute[187006]: 2025-12-11 10:02:21.913 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:24 np0005555140 podman[219557]: 2025-12-11 10:02:24.672192602 +0000 UTC m=+0.049725825 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:02:24 np0005555140 nova_compute[187006]: 2025-12-11 10:02:24.743 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:25 np0005555140 nova_compute[187006]: 2025-12-11 10:02:25.582 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:25 np0005555140 nova_compute[187006]: 2025-12-11 10:02:25.655 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:26 np0005555140 podman[219583]: 2025-12-11 10:02:26.704120466 +0000 UTC m=+0.072418566 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Dec 11 05:02:26 np0005555140 podman[219582]: 2025-12-11 10:02:26.748217749 +0000 UTC m=+0.119795403 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 05:02:26 np0005555140 nova_compute[187006]: 2025-12-11 10:02:26.915 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:27 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:27.415 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:02:29 np0005555140 nova_compute[187006]: 2025-12-11 10:02:29.745 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:31 np0005555140 nova_compute[187006]: 2025-12-11 10:02:31.917 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:34 np0005555140 nova_compute[187006]: 2025-12-11 10:02:34.717 187010 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765447339.7156096, 5237d116-713e-4af3-822e-8ce58f99769b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 05:02:34 np0005555140 nova_compute[187006]: 2025-12-11 10:02:34.717 187010 INFO nova.compute.manager [-] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] VM Stopped (Lifecycle Event)#033[00m
Dec 11 05:02:34 np0005555140 nova_compute[187006]: 2025-12-11 10:02:34.742 187010 DEBUG nova.compute.manager [None req-9937cb8e-e940-43f2-b47b-17caf724ec1b - - - - - -] [instance: 5237d116-713e-4af3-822e-8ce58f99769b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 05:02:34 np0005555140 nova_compute[187006]: 2025-12-11 10:02:34.747 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:36 np0005555140 nova_compute[187006]: 2025-12-11 10:02:36.918 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:37 np0005555140 podman[219627]: 2025-12-11 10:02:37.701102868 +0000 UTC m=+0.076494703 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:02:39 np0005555140 nova_compute[187006]: 2025-12-11 10:02:39.750 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:41 np0005555140 nova_compute[187006]: 2025-12-11 10:02:41.919 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:44 np0005555140 nova_compute[187006]: 2025-12-11 10:02:44.758 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:45 np0005555140 podman[219651]: 2025-12-11 10:02:45.678669168 +0000 UTC m=+0.053621187 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 05:02:45 np0005555140 podman[219652]: 2025-12-11 10:02:45.687596594 +0000 UTC m=+0.057520339 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 05:02:46 np0005555140 nova_compute[187006]: 2025-12-11 10:02:46.922 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:48.629 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:02:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:48.629 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:02:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:02:48.629 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:02:49 np0005555140 podman[219689]: 2025-12-11 10:02:49.680081084 +0000 UTC m=+0.050781825 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 05:02:49 np0005555140 nova_compute[187006]: 2025-12-11 10:02:49.759 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:51 np0005555140 nova_compute[187006]: 2025-12-11 10:02:51.922 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:54 np0005555140 nova_compute[187006]: 2025-12-11 10:02:54.762 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:55 np0005555140 podman[219709]: 2025-12-11 10:02:55.673740797 +0000 UTC m=+0.048800869 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:02:56 np0005555140 ovn_controller[95438]: 2025-12-11T10:02:56Z|00181|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 11 05:02:56 np0005555140 nova_compute[187006]: 2025-12-11 10:02:56.924 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:02:57 np0005555140 podman[219736]: 2025-12-11 10:02:57.702064397 +0000 UTC m=+0.062546503 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Dec 11 05:02:57 np0005555140 podman[219735]: 2025-12-11 10:02:57.729783521 +0000 UTC m=+0.092223203 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 05:02:57 np0005555140 nova_compute[187006]: 2025-12-11 10:02:57.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:58 np0005555140 nova_compute[187006]: 2025-12-11 10:02:58.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:58 np0005555140 nova_compute[187006]: 2025-12-11 10:02:58.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:02:59 np0005555140 nova_compute[187006]: 2025-12-11 10:02:59.763 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:00 np0005555140 nova_compute[187006]: 2025-12-11 10:03:00.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:00 np0005555140 nova_compute[187006]: 2025-12-11 10:03:00.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:03:00 np0005555140 nova_compute[187006]: 2025-12-11 10:03:00.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:03:00 np0005555140 nova_compute[187006]: 2025-12-11 10:03:00.851 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:03:01 np0005555140 nova_compute[187006]: 2025-12-11 10:03:01.925 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:02 np0005555140 nova_compute[187006]: 2025-12-11 10:03:02.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:02 np0005555140 nova_compute[187006]: 2025-12-11 10:03:02.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:02 np0005555140 nova_compute[187006]: 2025-12-11 10:03:02.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:03:03 np0005555140 nova_compute[187006]: 2025-12-11 10:03:03.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:04 np0005555140 nova_compute[187006]: 2025-12-11 10:03:04.766 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:04 np0005555140 nova_compute[187006]: 2025-12-11 10:03:04.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:04 np0005555140 nova_compute[187006]: 2025-12-11 10:03:04.858 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:03:04 np0005555140 nova_compute[187006]: 2025-12-11 10:03:04.859 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:03:04 np0005555140 nova_compute[187006]: 2025-12-11 10:03:04.859 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:03:04 np0005555140 nova_compute[187006]: 2025-12-11 10:03:04.860 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.069 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.070 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5770MB free_disk=73.32830810546875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.070 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.071 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.124 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.124 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.139 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing inventories for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.155 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Updating ProviderTree inventory for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.155 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.173 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing aggregate associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.200 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing trait associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,HW_CPU_X86_SVM,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.238 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.269 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.290 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:03:05 np0005555140 nova_compute[187006]: 2025-12-11 10:03:05.290 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:03:06 np0005555140 nova_compute[187006]: 2025-12-11 10:03:06.928 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:08 np0005555140 nova_compute[187006]: 2025-12-11 10:03:08.286 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:08 np0005555140 podman[219785]: 2025-12-11 10:03:08.705624479 +0000 UTC m=+0.064383676 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 05:03:09 np0005555140 nova_compute[187006]: 2025-12-11 10:03:09.770 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:09 np0005555140 nova_compute[187006]: 2025-12-11 10:03:09.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:11 np0005555140 nova_compute[187006]: 2025-12-11 10:03:11.930 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:14 np0005555140 nova_compute[187006]: 2025-12-11 10:03:14.771 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:16 np0005555140 podman[219810]: 2025-12-11 10:03:16.681796649 +0000 UTC m=+0.060229086 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:03:16 np0005555140 podman[219811]: 2025-12-11 10:03:16.682034356 +0000 UTC m=+0.056687965 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:03:16 np0005555140 nova_compute[187006]: 2025-12-11 10:03:16.931 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:19 np0005555140 nova_compute[187006]: 2025-12-11 10:03:19.773 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:20 np0005555140 podman[219849]: 2025-12-11 10:03:20.710479706 +0000 UTC m=+0.074038412 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 05:03:21 np0005555140 nova_compute[187006]: 2025-12-11 10:03:21.935 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:24 np0005555140 nova_compute[187006]: 2025-12-11 10:03:24.775 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:24 np0005555140 systemd-logind[787]: New session 26 of user zuul.
Dec 11 05:03:24 np0005555140 systemd[1]: Started Session 26 of User zuul.
Dec 11 05:03:26 np0005555140 podman[219906]: 2025-12-11 10:03:26.182482933 +0000 UTC m=+0.061706219 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 05:03:26 np0005555140 nova_compute[187006]: 2025-12-11 10:03:26.936 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:28 np0005555140 podman[220057]: 2025-12-11 10:03:28.695394475 +0000 UTC m=+0.066758273 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 11 05:03:28 np0005555140 podman[220056]: 2025-12-11 10:03:28.718043804 +0000 UTC m=+0.096162786 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller)
Dec 11 05:03:29 np0005555140 nova_compute[187006]: 2025-12-11 10:03:29.777 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:31 np0005555140 nova_compute[187006]: 2025-12-11 10:03:31.939 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:31 np0005555140 ovs-vsctl[220154]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 11 05:03:32 np0005555140 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 219896 (sos)
Dec 11 05:03:32 np0005555140 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 11 05:03:32 np0005555140 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 11 05:03:32 np0005555140 virtqemud[186728]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 11 05:03:32 np0005555140 virtqemud[186728]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 11 05:03:32 np0005555140 virtqemud[186728]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 11 05:03:34 np0005555140 nova_compute[187006]: 2025-12-11 10:03:34.780 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:36 np0005555140 systemd[1]: Starting Hostname Service...
Dec 11 05:03:36 np0005555140 systemd[1]: Started Hostname Service.
Dec 11 05:03:36 np0005555140 nova_compute[187006]: 2025-12-11 10:03:36.939 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:39 np0005555140 podman[221063]: 2025-12-11 10:03:39.777129048 +0000 UTC m=+0.144775754 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:03:39 np0005555140 nova_compute[187006]: 2025-12-11 10:03:39.782 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:41 np0005555140 nova_compute[187006]: 2025-12-11 10:03:41.940 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:42 np0005555140 ovs-appctl[221820]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 11 05:03:42 np0005555140 ovs-appctl[221825]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 11 05:03:42 np0005555140 ovs-appctl[221831]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 11 05:03:44 np0005555140 nova_compute[187006]: 2025-12-11 10:03:44.784 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:46 np0005555140 nova_compute[187006]: 2025-12-11 10:03:46.942 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:47 np0005555140 podman[222823]: 2025-12-11 10:03:47.039712437 +0000 UTC m=+0.061435749 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:03:47 np0005555140 podman[222822]: 2025-12-11 10:03:47.061685786 +0000 UTC m=+0.083547142 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec 11 05:03:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:03:48.629 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:03:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:03:48.631 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:03:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:03:48.631 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:03:49 np0005555140 virtqemud[186728]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 11 05:03:49 np0005555140 nova_compute[187006]: 2025-12-11 10:03:49.786 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:03:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:03:50 np0005555140 podman[223268]: 2025-12-11 10:03:50.822656322 +0000 UTC m=+0.063241141 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 11 05:03:51 np0005555140 systemd[1]: Starting Time & Date Service...
Dec 11 05:03:51 np0005555140 systemd[1]: Started Time & Date Service.
Dec 11 05:03:51 np0005555140 nova_compute[187006]: 2025-12-11 10:03:51.944 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:54 np0005555140 nova_compute[187006]: 2025-12-11 10:03:54.788 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:56 np0005555140 podman[223366]: 2025-12-11 10:03:56.680623886 +0000 UTC m=+0.054570942 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 05:03:56 np0005555140 nova_compute[187006]: 2025-12-11 10:03:56.946 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:58 np0005555140 nova_compute[187006]: 2025-12-11 10:03:58.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:58 np0005555140 nova_compute[187006]: 2025-12-11 10:03:58.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:03:58 np0005555140 podman[223391]: 2025-12-11 10:03:58.881935211 +0000 UTC m=+0.066282308 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.buildah.version=1.33.7, config_id=edpm, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Dec 11 05:03:58 np0005555140 podman[223390]: 2025-12-11 10:03:58.922210413 +0000 UTC m=+0.110188044 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 05:03:59 np0005555140 nova_compute[187006]: 2025-12-11 10:03:59.790 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:03:59 np0005555140 nova_compute[187006]: 2025-12-11 10:03:59.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:04:01 np0005555140 nova_compute[187006]: 2025-12-11 10:04:01.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:04:01 np0005555140 nova_compute[187006]: 2025-12-11 10:04:01.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:04:01 np0005555140 nova_compute[187006]: 2025-12-11 10:04:01.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:04:01 np0005555140 nova_compute[187006]: 2025-12-11 10:04:01.847 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:04:01 np0005555140 nova_compute[187006]: 2025-12-11 10:04:01.947 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:02 np0005555140 nova_compute[187006]: 2025-12-11 10:04:02.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:04:02 np0005555140 nova_compute[187006]: 2025-12-11 10:04:02.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:04:02 np0005555140 nova_compute[187006]: 2025-12-11 10:04:02.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:04:03 np0005555140 nova_compute[187006]: 2025-12-11 10:04:03.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:04:04 np0005555140 nova_compute[187006]: 2025-12-11 10:04:04.792 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:04 np0005555140 nova_compute[187006]: 2025-12-11 10:04:04.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:04:04 np0005555140 nova_compute[187006]: 2025-12-11 10:04:04.858 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:04:04 np0005555140 nova_compute[187006]: 2025-12-11 10:04:04.858 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:04:04 np0005555140 nova_compute[187006]: 2025-12-11 10:04:04.859 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:04:04 np0005555140 nova_compute[187006]: 2025-12-11 10:04:04.859 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.013 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.014 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5297MB free_disk=73.06671142578125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.015 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.015 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.090 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.090 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.114 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.129 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.130 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:04:05 np0005555140 nova_compute[187006]: 2025-12-11 10:04:05.130 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:04:06 np0005555140 nova_compute[187006]: 2025-12-11 10:04:06.950 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:09 np0005555140 nova_compute[187006]: 2025-12-11 10:04:09.794 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:10 np0005555140 podman[223437]: 2025-12-11 10:04:10.71147657 +0000 UTC m=+0.081802230 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 05:04:11 np0005555140 nova_compute[187006]: 2025-12-11 10:04:11.130 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:04:11 np0005555140 nova_compute[187006]: 2025-12-11 10:04:11.951 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:12 np0005555140 systemd[1]: session-26.scope: Deactivated successfully.
Dec 11 05:04:12 np0005555140 systemd[1]: session-26.scope: Consumed 1min 15.211s CPU time, 500.2M memory peak, read 103.0M from disk, written 23.7M to disk.
Dec 11 05:04:12 np0005555140 systemd-logind[787]: Session 26 logged out. Waiting for processes to exit.
Dec 11 05:04:12 np0005555140 systemd-logind[787]: Removed session 26.
Dec 11 05:04:12 np0005555140 systemd-logind[787]: New session 27 of user zuul.
Dec 11 05:04:12 np0005555140 systemd[1]: Started Session 27 of User zuul.
Dec 11 05:04:13 np0005555140 systemd[1]: session-27.scope: Deactivated successfully.
Dec 11 05:04:13 np0005555140 systemd-logind[787]: Session 27 logged out. Waiting for processes to exit.
Dec 11 05:04:13 np0005555140 systemd-logind[787]: Removed session 27.
Dec 11 05:04:13 np0005555140 systemd-logind[787]: New session 28 of user zuul.
Dec 11 05:04:13 np0005555140 systemd[1]: Started Session 28 of User zuul.
Dec 11 05:04:13 np0005555140 systemd[1]: session-28.scope: Deactivated successfully.
Dec 11 05:04:13 np0005555140 systemd-logind[787]: Session 28 logged out. Waiting for processes to exit.
Dec 11 05:04:13 np0005555140 systemd-logind[787]: Removed session 28.
Dec 11 05:04:14 np0005555140 nova_compute[187006]: 2025-12-11 10:04:14.796 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:16 np0005555140 nova_compute[187006]: 2025-12-11 10:04:16.952 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:17 np0005555140 podman[223520]: 2025-12-11 10:04:17.710648734 +0000 UTC m=+0.075747549 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:04:17 np0005555140 podman[223521]: 2025-12-11 10:04:17.723304426 +0000 UTC m=+0.092777456 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator 
team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 05:04:19 np0005555140 nova_compute[187006]: 2025-12-11 10:04:19.798 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:21 np0005555140 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 11 05:04:21 np0005555140 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 05:04:21 np0005555140 podman[223561]: 2025-12-11 10:04:21.692652825 +0000 UTC m=+0.071577648 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 05:04:21 np0005555140 nova_compute[187006]: 2025-12-11 10:04:21.953 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:24 np0005555140 nova_compute[187006]: 2025-12-11 10:04:24.801 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:26 np0005555140 nova_compute[187006]: 2025-12-11 10:04:26.956 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:27 np0005555140 podman[223584]: 2025-12-11 10:04:27.686666873 +0000 UTC m=+0.062682144 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:04:29 np0005555140 podman[223608]: 2025-12-11 10:04:29.706650527 +0000 UTC m=+0.081298007 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Dec 11 05:04:29 np0005555140 podman[223609]: 2025-12-11 10:04:29.707706167 +0000 UTC m=+0.082824031 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Dec 11 05:04:29 np0005555140 nova_compute[187006]: 2025-12-11 10:04:29.802 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:31 np0005555140 nova_compute[187006]: 2025-12-11 10:04:31.958 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:34 np0005555140 nova_compute[187006]: 2025-12-11 10:04:34.804 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:36 np0005555140 nova_compute[187006]: 2025-12-11 10:04:36.960 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:39 np0005555140 nova_compute[187006]: 2025-12-11 10:04:39.807 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:41 np0005555140 podman[223653]: 2025-12-11 10:04:41.688774485 +0000 UTC m=+0.060101631 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:04:41 np0005555140 nova_compute[187006]: 2025-12-11 10:04:41.962 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:44 np0005555140 nova_compute[187006]: 2025-12-11 10:04:44.812 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:46 np0005555140 nova_compute[187006]: 2025-12-11 10:04:46.964 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:04:48.631 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:04:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:04:48.631 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:04:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:04:48.632 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
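The three DEBUG lines above are oslo.concurrency's standard named-lock trace: an acquire attempt, the acquisition with time waited, and the release with time held. A minimal stdlib-only sketch of that pattern (the names `_locks`, `LOG`, and `synchronized` are illustrative stand-ins, not the library's internals):

```python
import threading
import time
from contextlib import contextmanager

_locks = {}  # hypothetical process-local registry of named locks
LOG = []     # captured messages, mirroring the DEBUG lines journald shows

@contextmanager
def synchronized(name, caller):
    """Sketch of the acquire/waited/held trace seen in the log."""
    lock = _locks.setdefault(name, threading.Lock())
    LOG.append(f'Acquiring lock "{name}" by "{caller}"')
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    LOG.append(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        held = time.monotonic() - t1
        lock.release()
        LOG.append(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

with synchronized("_check_child_processes",
                  "ProcessMonitor._check_child_processes"):
    pass  # critical section: walk the monitored child processes
```

The near-zero waited/held times in the log indicate the lock is uncontended and the check itself is cheap.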
Dec 11 05:04:48 np0005555140 podman[223679]: 2025-12-11 10:04:48.724893346 +0000 UTC m=+0.085730404 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Dec 11 05:04:48 np0005555140 podman[223680]: 2025-12-11 10:04:48.737029173 +0000 UTC m=+0.100792925 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 11 05:04:49 np0005555140 nova_compute[187006]: 2025-12-11 10:04:49.815 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:51 np0005555140 nova_compute[187006]: 2025-12-11 10:04:51.964 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:52 np0005555140 podman[223720]: 2025-12-11 10:04:52.691414212 +0000 UTC m=+0.061803520 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Dec 11 05:04:54 np0005555140 nova_compute[187006]: 2025-12-11 10:04:54.818 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:56 np0005555140 nova_compute[187006]: 2025-12-11 10:04:56.966 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:58 np0005555140 podman[223741]: 2025-12-11 10:04:58.689423543 +0000 UTC m=+0.062702995 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
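The `--collector.systemd.unit-include` flag in the node_exporter command above restricts the systemd collector to a handful of unit patterns; node_exporter treats include/exclude values as anchored regular expressions, so `fullmatch` approximates its behavior. A quick sketch of which units that filter keeps (the unit names below are illustrative, not taken from this host):

```python
import re

# Pattern copied from the node_exporter command line in the log entry above.
UNIT_INCLUDE = re.compile(r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service")

units = [
    "edpm_nova_compute.service",  # matches edpm_.*
    "ovs-vswitchd.service",       # matches ovs.*
    "openvswitch.service",        # exact alternative
    "virtqemud.service",          # matches virt.*
    "rsyslog.service",            # exact alternative
    "sshd.service",               # no alternative matches: filtered out
]
matched = [u for u in units if UNIT_INCLUDE.fullmatch(u)]
# Only the first five survive the filter.
```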
Dec 11 05:04:59 np0005555140 nova_compute[187006]: 2025-12-11 10:04:59.822 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:04:59 np0005555140 nova_compute[187006]: 2025-12-11 10:04:59.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:00 np0005555140 podman[223767]: 2025-12-11 10:05:00.694989066 +0000 UTC m=+0.067648746 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Dec 11 05:05:00 np0005555140 podman[223766]: 2025-12-11 10:05:00.7143429 +0000 UTC m=+0.091271322 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 05:05:00 np0005555140 nova_compute[187006]: 2025-12-11 10:05:00.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:00 np0005555140 nova_compute[187006]: 2025-12-11 10:05:00.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:01 np0005555140 nova_compute[187006]: 2025-12-11 10:05:01.967 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:02 np0005555140 nova_compute[187006]: 2025-12-11 10:05:02.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:02 np0005555140 nova_compute[187006]: 2025-12-11 10:05:02.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:02 np0005555140 nova_compute[187006]: 2025-12-11 10:05:02.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:05:03 np0005555140 nova_compute[187006]: 2025-12-11 10:05:03.831 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:03 np0005555140 nova_compute[187006]: 2025-12-11 10:05:03.832 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:05:03 np0005555140 nova_compute[187006]: 2025-12-11 10:05:03.832 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:05:04 np0005555140 nova_compute[187006]: 2025-12-11 10:05:04.824 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:04 np0005555140 nova_compute[187006]: 2025-12-11 10:05:04.925 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:05:05 np0005555140 nova_compute[187006]: 2025-12-11 10:05:05.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:06 np0005555140 nova_compute[187006]: 2025-12-11 10:05:06.830 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:06 np0005555140 nova_compute[187006]: 2025-12-11 10:05:06.867 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:05:06 np0005555140 nova_compute[187006]: 2025-12-11 10:05:06.869 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:05:06 np0005555140 nova_compute[187006]: 2025-12-11 10:05:06.869 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:05:06 np0005555140 nova_compute[187006]: 2025-12-11 10:05:06.870 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:05:06 np0005555140 nova_compute[187006]: 2025-12-11 10:05:06.968 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.061 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.062 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5550MB free_disk=73.32785415649414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.062 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.063 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.155 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.155 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.206 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.230 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
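The inventory comparison above is against Placement's effective capacity per resource class, which is `(total - reserved) * allocation_ratio`. A minimal sketch plugging in the values from this log line:

```python
# Inventory data copied from the scheduler report client log line above.
inventory = {
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
}

def capacity(inv):
    """Effective schedulable capacity as Placement computes it."""
    return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

caps = {rc: capacity(inv) for rc, inv in inventory.items()}
# VCPU: (8 - 0) * 4.0 = 32.0 schedulable vCPUs
# MEMORY_MB: (7679 - 512) * 1.0 = 7167.0 MB
# DISK_GB: (79 - 1) * 0.9 ≈ 70.2 GB (disk is deliberately under-committed)
```

So this 8-core node can back up to 32 allocated vCPUs, while memory is not over-committed at all.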
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.231 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:05:07 np0005555140 nova_compute[187006]: 2025-12-11 10:05:07.231 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:05:09 np0005555140 nova_compute[187006]: 2025-12-11 10:05:09.829 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:11 np0005555140 nova_compute[187006]: 2025-12-11 10:05:11.969 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:12 np0005555140 podman[223813]: 2025-12-11 10:05:12.67708199 +0000 UTC m=+0.051635218 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 05:05:13 np0005555140 nova_compute[187006]: 2025-12-11 10:05:13.226 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:13 np0005555140 nova_compute[187006]: 2025-12-11 10:05:13.358 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:14 np0005555140 nova_compute[187006]: 2025-12-11 10:05:14.832 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:16 np0005555140 nova_compute[187006]: 2025-12-11 10:05:16.971 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:19 np0005555140 podman[223838]: 2025-12-11 10:05:19.694846163 +0000 UTC m=+0.063949291 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 05:05:19 np0005555140 podman[223839]: 2025-12-11 10:05:19.705655583 +0000 UTC m=+0.065320641 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 11 05:05:19 np0005555140 nova_compute[187006]: 2025-12-11 10:05:19.834 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:21 np0005555140 nova_compute[187006]: 2025-12-11 10:05:21.974 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:23 np0005555140 podman[223878]: 2025-12-11 10:05:23.686191942 +0000 UTC m=+0.055094957 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:05:24 np0005555140 nova_compute[187006]: 2025-12-11 10:05:24.837 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:26 np0005555140 nova_compute[187006]: 2025-12-11 10:05:26.977 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:29 np0005555140 podman[223898]: 2025-12-11 10:05:29.681425976 +0000 UTC m=+0.059138172 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 05:05:29 np0005555140 nova_compute[187006]: 2025-12-11 10:05:29.839 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:31 np0005555140 podman[223923]: 2025-12-11 10:05:31.693159095 +0000 UTC m=+0.059596047 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 11 05:05:31 np0005555140 podman[223922]: 2025-12-11 10:05:31.721895427 +0000 UTC m=+0.090332896 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 05:05:31 np0005555140 nova_compute[187006]: 2025-12-11 10:05:31.978 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:34 np0005555140 nova_compute[187006]: 2025-12-11 10:05:34.841 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:36 np0005555140 nova_compute[187006]: 2025-12-11 10:05:36.979 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:39 np0005555140 nova_compute[187006]: 2025-12-11 10:05:39.843 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:41 np0005555140 nova_compute[187006]: 2025-12-11 10:05:41.981 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:43 np0005555140 podman[223969]: 2025-12-11 10:05:43.683014983 +0000 UTC m=+0.058263499 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 05:05:44 np0005555140 nova_compute[187006]: 2025-12-11 10:05:44.845 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:46 np0005555140 nova_compute[187006]: 2025-12-11 10:05:46.983 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:05:48.631 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:05:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:05:48.632 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:05:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:05:48.632 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:05:49 np0005555140 nova_compute[187006]: 2025-12-11 10:05:49.847 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:05:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:05:50 np0005555140 podman[223996]: 2025-12-11 10:05:50.683766517 +0000 UTC m=+0.058514668 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 05:05:50 np0005555140 podman[223995]: 2025-12-11 10:05:50.706351355 +0000 UTC m=+0.085979746 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:05:51 np0005555140 nova_compute[187006]: 2025-12-11 10:05:51.983 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:54 np0005555140 podman[224035]: 2025-12-11 10:05:54.676185369 +0000 UTC m=+0.050120878 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 11 05:05:54 np0005555140 nova_compute[187006]: 2025-12-11 10:05:54.849 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:56 np0005555140 nova_compute[187006]: 2025-12-11 10:05:56.984 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:05:59 np0005555140 nova_compute[187006]: 2025-12-11 10:05:59.830 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:05:59 np0005555140 nova_compute[187006]: 2025-12-11 10:05:59.851 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:00 np0005555140 podman[224054]: 2025-12-11 10:06:00.704076042 +0000 UTC m=+0.075118404 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:06:01 np0005555140 nova_compute[187006]: 2025-12-11 10:06:01.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:01 np0005555140 nova_compute[187006]: 2025-12-11 10:06:01.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:01 np0005555140 nova_compute[187006]: 2025-12-11 10:06:01.986 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:02 np0005555140 podman[224079]: 2025-12-11 10:06:02.679857182 +0000 UTC m=+0.052962479 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, 
release=1755695350, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Dec 11 05:06:02 np0005555140 podman[224078]: 2025-12-11 10:06:02.700647308 +0000 UTC m=+0.073942180 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 05:06:03 np0005555140 nova_compute[187006]: 2025-12-11 10:06:03.825 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:03 np0005555140 nova_compute[187006]: 2025-12-11 10:06:03.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:03 np0005555140 nova_compute[187006]: 2025-12-11 10:06:03.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:06:04 np0005555140 nova_compute[187006]: 2025-12-11 10:06:04.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:04 np0005555140 nova_compute[187006]: 2025-12-11 10:06:04.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:06:04 np0005555140 nova_compute[187006]: 2025-12-11 10:06:04.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:06:04 np0005555140 nova_compute[187006]: 2025-12-11 10:06:04.852 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:06:04 np0005555140 nova_compute[187006]: 2025-12-11 10:06:04.854 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:05 np0005555140 nova_compute[187006]: 2025-12-11 10:06:05.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:06 np0005555140 nova_compute[187006]: 2025-12-11 10:06:06.987 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:07 np0005555140 nova_compute[187006]: 2025-12-11 10:06:07.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:07 np0005555140 nova_compute[187006]: 2025-12-11 10:06:07.873 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:06:07 np0005555140 nova_compute[187006]: 2025-12-11 10:06:07.874 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:06:07 np0005555140 nova_compute[187006]: 2025-12-11 10:06:07.874 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:06:07 np0005555140 nova_compute[187006]: 2025-12-11 10:06:07.875 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.021 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.022 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5620MB free_disk=73.32817459106445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.022 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.022 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.107 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.108 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.132 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.156 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.157 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:06:08 np0005555140 nova_compute[187006]: 2025-12-11 10:06:08.158 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:06:09 np0005555140 nova_compute[187006]: 2025-12-11 10:06:09.856 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:11 np0005555140 nova_compute[187006]: 2025-12-11 10:06:11.989 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:14 np0005555140 podman[224125]: 2025-12-11 10:06:14.674131267 +0000 UTC m=+0.050505799 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 05:06:14 np0005555140 nova_compute[187006]: 2025-12-11 10:06:14.858 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:15 np0005555140 nova_compute[187006]: 2025-12-11 10:06:15.158 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:16 np0005555140 nova_compute[187006]: 2025-12-11 10:06:16.991 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:19 np0005555140 nova_compute[187006]: 2025-12-11 10:06:19.860 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:21 np0005555140 podman[224149]: 2025-12-11 10:06:21.703906592 +0000 UTC m=+0.072941322 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 05:06:21 np0005555140 podman[224150]: 2025-12-11 10:06:21.718708186 +0000 UTC m=+0.091248307 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 05:06:21 np0005555140 nova_compute[187006]: 2025-12-11 10:06:21.992 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:24 np0005555140 nova_compute[187006]: 2025-12-11 10:06:24.863 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:25 np0005555140 podman[224186]: 2025-12-11 10:06:25.708644827 +0000 UTC m=+0.077385219 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 05:06:26 np0005555140 nova_compute[187006]: 2025-12-11 10:06:26.996 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:29 np0005555140 nova_compute[187006]: 2025-12-11 10:06:29.864 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:31 np0005555140 podman[224205]: 2025-12-11 10:06:31.699779446 +0000 UTC m=+0.079677845 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 05:06:31 np0005555140 nova_compute[187006]: 2025-12-11 10:06:31.997 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:33 np0005555140 podman[224231]: 2025-12-11 10:06:33.68266631 +0000 UTC m=+0.058023594 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 05:06:33 np0005555140 podman[224230]: 2025-12-11 10:06:33.708531612 +0000 UTC m=+0.087279243 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 05:06:34 np0005555140 nova_compute[187006]: 2025-12-11 10:06:34.866 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:36 np0005555140 nova_compute[187006]: 2025-12-11 10:06:36.998 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:39 np0005555140 nova_compute[187006]: 2025-12-11 10:06:39.868 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:42 np0005555140 nova_compute[187006]: 2025-12-11 10:06:42.000 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:44 np0005555140 nova_compute[187006]: 2025-12-11 10:06:44.870 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:45 np0005555140 podman[224275]: 2025-12-11 10:06:45.66561648 +0000 UTC m=+0.045305481 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:06:47 np0005555140 nova_compute[187006]: 2025-12-11 10:06:47.003 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:06:48.633 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:06:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:06:48.633 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:06:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:06:48.633 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:06:49 np0005555140 nova_compute[187006]: 2025-12-11 10:06:49.873 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:52 np0005555140 nova_compute[187006]: 2025-12-11 10:06:52.005 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:52 np0005555140 podman[224300]: 2025-12-11 10:06:52.677729318 +0000 UTC m=+0.055354328 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 05:06:52 np0005555140 podman[224301]: 2025-12-11 10:06:52.6777994 +0000 UTC m=+0.052134335 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:06:54 np0005555140 nova_compute[187006]: 2025-12-11 10:06:54.875 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:56 np0005555140 podman[224343]: 2025-12-11 10:06:56.706939276 +0000 UTC m=+0.085245064 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 11 05:06:57 np0005555140 nova_compute[187006]: 2025-12-11 10:06:57.007 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:06:57 np0005555140 nova_compute[187006]: 2025-12-11 10:06:57.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:57 np0005555140 nova_compute[187006]: 2025-12-11 10:06:57.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 05:06:59 np0005555140 nova_compute[187006]: 2025-12-11 10:06:59.855 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:06:59 np0005555140 nova_compute[187006]: 2025-12-11 10:06:59.878 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:02 np0005555140 nova_compute[187006]: 2025-12-11 10:07:02.008 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:02 np0005555140 podman[224363]: 2025-12-11 10:07:02.66784664 +0000 UTC m=+0.046545135 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 05:07:02 np0005555140 nova_compute[187006]: 2025-12-11 10:07:02.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:02 np0005555140 nova_compute[187006]: 2025-12-11 10:07:02.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:03 np0005555140 nova_compute[187006]: 2025-12-11 10:07:03.844 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:03 np0005555140 nova_compute[187006]: 2025-12-11 10:07:03.844 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:04 np0005555140 podman[224389]: 2025-12-11 10:07:04.679626043 +0000 UTC m=+0.050411936 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Dec 11 05:07:04 np0005555140 podman[224388]: 2025-12-11 10:07:04.704744454 +0000 UTC m=+0.079388477 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 05:07:04 np0005555140 nova_compute[187006]: 2025-12-11 10:07:04.880 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:05 np0005555140 nova_compute[187006]: 2025-12-11 10:07:05.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:05 np0005555140 nova_compute[187006]: 2025-12-11 10:07:05.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:07:06 np0005555140 nova_compute[187006]: 2025-12-11 10:07:06.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:06 np0005555140 nova_compute[187006]: 2025-12-11 10:07:06.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:07:06 np0005555140 nova_compute[187006]: 2025-12-11 10:07:06.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:07:06 np0005555140 nova_compute[187006]: 2025-12-11 10:07:06.845 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:07:07 np0005555140 nova_compute[187006]: 2025-12-11 10:07:07.010 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:07 np0005555140 nova_compute[187006]: 2025-12-11 10:07:07.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:08 np0005555140 nova_compute[187006]: 2025-12-11 10:07:08.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:08 np0005555140 nova_compute[187006]: 2025-12-11 10:07:08.873 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:07:08 np0005555140 nova_compute[187006]: 2025-12-11 10:07:08.874 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:07:08 np0005555140 nova_compute[187006]: 2025-12-11 10:07:08.874 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:07:08 np0005555140 nova_compute[187006]: 2025-12-11 10:07:08.874 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.032 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.033 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5664MB free_disk=73.32817459106445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.034 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.034 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.360 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.360 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.496 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.515 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.516 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.517 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.517 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.517 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.530 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 05:07:09 np0005555140 nova_compute[187006]: 2025-12-11 10:07:09.882 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:12 np0005555140 nova_compute[187006]: 2025-12-11 10:07:12.012 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:14 np0005555140 nova_compute[187006]: 2025-12-11 10:07:14.885 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:15 np0005555140 nova_compute[187006]: 2025-12-11 10:07:15.531 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:16 np0005555140 podman[224433]: 2025-12-11 10:07:16.676791641 +0000 UTC m=+0.048089340 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 05:07:17 np0005555140 nova_compute[187006]: 2025-12-11 10:07:17.014 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:17 np0005555140 nova_compute[187006]: 2025-12-11 10:07:17.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:19 np0005555140 nova_compute[187006]: 2025-12-11 10:07:19.887 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:20 np0005555140 nova_compute[187006]: 2025-12-11 10:07:20.990 187010 DEBUG oslo_concurrency.processutils [None req-96476cb3-d560-4888-ab5f-36c00e23f454 56b19a2ceea54f3e9a8cb95455546a47 41851b7da4be4b6e8751110baa8eccbc - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 05:07:21 np0005555140 nova_compute[187006]: 2025-12-11 10:07:21.011 187010 DEBUG oslo_concurrency.processutils [None req-96476cb3-d560-4888-ab5f-36c00e23f454 56b19a2ceea54f3e9a8cb95455546a47 41851b7da4be4b6e8751110baa8eccbc - - default default] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 05:07:22 np0005555140 nova_compute[187006]: 2025-12-11 10:07:22.015 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:23 np0005555140 podman[224460]: 2025-12-11 10:07:23.696065144 +0000 UTC m=+0.066396844 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 05:07:23 np0005555140 podman[224461]: 2025-12-11 10:07:23.702001634 +0000 UTC m=+0.069910855 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 05:07:24 np0005555140 nova_compute[187006]: 2025-12-11 10:07:24.889 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:26 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:07:26.901 104288 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '96:b3:9a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '12:7d:c7:e2:81:6f'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 05:07:26 np0005555140 nova_compute[187006]: 2025-12-11 10:07:26.901 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:26 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:07:26.902 104288 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 05:07:27 np0005555140 nova_compute[187006]: 2025-12-11 10:07:27.017 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:27 np0005555140 podman[224501]: 2025-12-11 10:07:27.675751541 +0000 UTC m=+0.054849603 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 05:07:29 np0005555140 nova_compute[187006]: 2025-12-11 10:07:29.892 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:30 np0005555140 nova_compute[187006]: 2025-12-11 10:07:30.741 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:07:32 np0005555140 nova_compute[187006]: 2025-12-11 10:07:32.019 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:32 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:07:32.904 104288 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f07ba53-a431-4669-9e8c-dcf2fed72095, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 05:07:33 np0005555140 podman[224519]: 2025-12-11 10:07:33.731464562 +0000 UTC m=+0.087870010 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:07:34 np0005555140 nova_compute[187006]: 2025-12-11 10:07:34.894 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:35 np0005555140 podman[224544]: 2025-12-11 10:07:35.687550577 +0000 UTC m=+0.062756450 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter)
Dec 11 05:07:35 np0005555140 podman[224543]: 2025-12-11 10:07:35.699763287 +0000 UTC m=+0.079457149 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 11 05:07:37 np0005555140 nova_compute[187006]: 2025-12-11 10:07:37.020 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:39 np0005555140 nova_compute[187006]: 2025-12-11 10:07:39.896 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:42 np0005555140 nova_compute[187006]: 2025-12-11 10:07:42.021 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:44 np0005555140 nova_compute[187006]: 2025-12-11 10:07:44.898 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:47 np0005555140 nova_compute[187006]: 2025-12-11 10:07:47.023 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:47 np0005555140 podman[224591]: 2025-12-11 10:07:47.707759614 +0000 UTC m=+0.079270384 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 05:07:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:07:48.635 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:07:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:07:48.635 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:07:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:07:48.635 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:07:49 np0005555140 nova_compute[187006]: 2025-12-11 10:07:49.901 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:07:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:07:52 np0005555140 nova_compute[187006]: 2025-12-11 10:07:52.024 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:54 np0005555140 podman[224615]: 2025-12-11 10:07:54.720351137 +0000 UTC m=+0.083734722 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 05:07:54 np0005555140 podman[224616]: 2025-12-11 10:07:54.750676026 +0000 UTC m=+0.108900053 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Dec 11 05:07:54 np0005555140 nova_compute[187006]: 2025-12-11 10:07:54.903 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:57 np0005555140 nova_compute[187006]: 2025-12-11 10:07:57.027 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:07:58 np0005555140 podman[224652]: 2025-12-11 10:07:58.718464583 +0000 UTC m=+0.084612697 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 05:07:59 np0005555140 nova_compute[187006]: 2025-12-11 10:07:59.905 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:01 np0005555140 nova_compute[187006]: 2025-12-11 10:08:01.844 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:02 np0005555140 nova_compute[187006]: 2025-12-11 10:08:02.028 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:03 np0005555140 nova_compute[187006]: 2025-12-11 10:08:03.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:04 np0005555140 podman[224671]: 2025-12-11 10:08:04.670768038 +0000 UTC m=+0.048477451 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 05:08:04 np0005555140 nova_compute[187006]: 2025-12-11 10:08:04.825 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:04 np0005555140 nova_compute[187006]: 2025-12-11 10:08:04.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:04 np0005555140 nova_compute[187006]: 2025-12-11 10:08:04.908 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:06 np0005555140 podman[224696]: 2025-12-11 10:08:06.703877892 +0000 UTC m=+0.070609405 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 11 05:08:06 np0005555140 podman[224695]: 2025-12-11 10:08:06.724688919 +0000 UTC m=+0.088198999 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 11 05:08:07 np0005555140 nova_compute[187006]: 2025-12-11 10:08:07.030 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:07 np0005555140 nova_compute[187006]: 2025-12-11 10:08:07.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:07 np0005555140 nova_compute[187006]: 2025-12-11 10:08:07.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.844 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.845 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.872 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.873 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.873 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:08:08 np0005555140 nova_compute[187006]: 2025-12-11 10:08:08.873 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.033 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.035 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5685MB free_disk=73.32819366455078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.035 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.035 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.105 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.105 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.169 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing inventories for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.189 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Updating ProviderTree inventory for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.190 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Updating inventory in ProviderTree for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.207 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing aggregate associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.225 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Refreshing trait associations for resource provider da0ef57a-f24e-4679-bba9-2f0d52d82a56, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_BMI,HW_CPU_X86_SVM,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.258 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.291 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.293 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.294 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:08:09 np0005555140 nova_compute[187006]: 2025-12-11 10:08:09.910 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:10 np0005555140 nova_compute[187006]: 2025-12-11 10:08:10.277 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:12 np0005555140 nova_compute[187006]: 2025-12-11 10:08:12.030 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:14 np0005555140 nova_compute[187006]: 2025-12-11 10:08:14.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:08:14 np0005555140 nova_compute[187006]: 2025-12-11 10:08:14.912 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:17 np0005555140 nova_compute[187006]: 2025-12-11 10:08:17.032 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:18 np0005555140 podman[224745]: 2025-12-11 10:08:18.673146107 +0000 UTC m=+0.046805323 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:08:19 np0005555140 nova_compute[187006]: 2025-12-11 10:08:19.915 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:22 np0005555140 nova_compute[187006]: 2025-12-11 10:08:22.034 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:24 np0005555140 nova_compute[187006]: 2025-12-11 10:08:24.917 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:25 np0005555140 podman[224770]: 2025-12-11 10:08:25.685959315 +0000 UTC m=+0.059557638 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Dec 11 05:08:25 np0005555140 podman[224771]: 2025-12-11 10:08:25.686229593 +0000 UTC m=+0.059413614 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm)
Dec 11 05:08:27 np0005555140 nova_compute[187006]: 2025-12-11 10:08:27.036 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:29 np0005555140 podman[224807]: 2025-12-11 10:08:29.68370648 +0000 UTC m=+0.064550362 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 11 05:08:29 np0005555140 nova_compute[187006]: 2025-12-11 10:08:29.919 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:32 np0005555140 nova_compute[187006]: 2025-12-11 10:08:32.037 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:34 np0005555140 nova_compute[187006]: 2025-12-11 10:08:34.922 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:35 np0005555140 podman[224828]: 2025-12-11 10:08:35.671184114 +0000 UTC m=+0.050841009 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 05:08:37 np0005555140 nova_compute[187006]: 2025-12-11 10:08:37.038 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:37 np0005555140 podman[224853]: 2025-12-11 10:08:37.686679323 +0000 UTC m=+0.059679892 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible)
Dec 11 05:08:37 np0005555140 podman[224852]: 2025-12-11 10:08:37.733011752 +0000 UTC m=+0.105406833 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 05:08:39 np0005555140 nova_compute[187006]: 2025-12-11 10:08:39.927 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:42 np0005555140 nova_compute[187006]: 2025-12-11 10:08:42.040 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:44 np0005555140 nova_compute[187006]: 2025-12-11 10:08:44.929 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:47 np0005555140 nova_compute[187006]: 2025-12-11 10:08:47.041 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:08:48.637 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:08:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:08:48.638 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:08:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:08:48.638 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:08:49 np0005555140 podman[224898]: 2025-12-11 10:08:49.674812651 +0000 UTC m=+0.053928747 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:08:49 np0005555140 nova_compute[187006]: 2025-12-11 10:08:49.931 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:52 np0005555140 nova_compute[187006]: 2025-12-11 10:08:52.042 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:54 np0005555140 nova_compute[187006]: 2025-12-11 10:08:54.933 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:56 np0005555140 podman[224923]: 2025-12-11 10:08:56.687875996 +0000 UTC m=+0.061720310 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 11 05:08:56 np0005555140 podman[224924]: 2025-12-11 10:08:56.692604031 +0000 UTC m=+0.061384820 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=edpm)
Dec 11 05:08:57 np0005555140 nova_compute[187006]: 2025-12-11 10:08:57.043 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:08:59 np0005555140 nova_compute[187006]: 2025-12-11 10:08:59.935 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:00 np0005555140 podman[224964]: 2025-12-11 10:09:00.670533958 +0000 UTC m=+0.046084583 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 11 05:09:02 np0005555140 nova_compute[187006]: 2025-12-11 10:09:02.045 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:02 np0005555140 nova_compute[187006]: 2025-12-11 10:09:02.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:04 np0005555140 nova_compute[187006]: 2025-12-11 10:09:04.937 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:05 np0005555140 nova_compute[187006]: 2025-12-11 10:09:05.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:06 np0005555140 podman[224986]: 2025-12-11 10:09:06.683089833 +0000 UTC m=+0.053748182 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 05:09:06 np0005555140 nova_compute[187006]: 2025-12-11 10:09:06.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:06 np0005555140 nova_compute[187006]: 2025-12-11 10:09:06.827 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:07 np0005555140 nova_compute[187006]: 2025-12-11 10:09:07.048 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:08 np0005555140 podman[225011]: 2025-12-11 10:09:08.683347205 +0000 UTC m=+0.051561339 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Dec 11 05:09:08 np0005555140 podman[225010]: 2025-12-11 10:09:08.697808349 +0000 UTC m=+0.072636213 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 05:09:08 np0005555140 nova_compute[187006]: 2025-12-11 10:09:08.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:08 np0005555140 nova_compute[187006]: 2025-12-11 10:09:08.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.869 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.870 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.870 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.897 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.897 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.897 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.898 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:09:09 np0005555140 nova_compute[187006]: 2025-12-11 10:09:09.939 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.057 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.058 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.32817077636719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.058 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.059 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.121 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.121 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.151 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.175 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.176 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:09:10 np0005555140 nova_compute[187006]: 2025-12-11 10:09:10.176 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:09:12 np0005555140 nova_compute[187006]: 2025-12-11 10:09:12.049 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:14 np0005555140 nova_compute[187006]: 2025-12-11 10:09:14.942 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:16 np0005555140 nova_compute[187006]: 2025-12-11 10:09:16.135 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:17 np0005555140 nova_compute[187006]: 2025-12-11 10:09:17.053 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:19 np0005555140 nova_compute[187006]: 2025-12-11 10:09:19.825 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:09:19 np0005555140 nova_compute[187006]: 2025-12-11 10:09:19.944 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:20 np0005555140 podman[225057]: 2025-12-11 10:09:20.678673398 +0000 UTC m=+0.050296333 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 05:09:22 np0005555140 nova_compute[187006]: 2025-12-11 10:09:22.053 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:24 np0005555140 nova_compute[187006]: 2025-12-11 10:09:24.946 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:27 np0005555140 nova_compute[187006]: 2025-12-11 10:09:27.055 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:27 np0005555140 podman[225081]: 2025-12-11 10:09:27.676621871 +0000 UTC m=+0.055238175 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 05:09:27 np0005555140 podman[225082]: 2025-12-11 10:09:27.687979227 +0000 UTC m=+0.060083704 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm)
Dec 11 05:09:29 np0005555140 nova_compute[187006]: 2025-12-11 10:09:29.949 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:31 np0005555140 podman[225120]: 2025-12-11 10:09:31.679570495 +0000 UTC m=+0.052803715 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Dec 11 05:09:32 np0005555140 nova_compute[187006]: 2025-12-11 10:09:32.057 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:34 np0005555140 nova_compute[187006]: 2025-12-11 10:09:34.951 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:37 np0005555140 nova_compute[187006]: 2025-12-11 10:09:37.059 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:37 np0005555140 podman[225137]: 2025-12-11 10:09:37.695280829 +0000 UTC m=+0.071092309 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 05:09:39 np0005555140 podman[225162]: 2025-12-11 10:09:39.702743618 +0000 UTC m=+0.068923967 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 11 05:09:39 np0005555140 podman[225161]: 2025-12-11 10:09:39.718696296 +0000 UTC m=+0.095088527 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 05:09:39 np0005555140 nova_compute[187006]: 2025-12-11 10:09:39.952 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:42 np0005555140 nova_compute[187006]: 2025-12-11 10:09:42.061 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:44 np0005555140 nova_compute[187006]: 2025-12-11 10:09:44.959 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:47 np0005555140 nova_compute[187006]: 2025-12-11 10:09:47.065 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:09:48.639 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:09:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:09:48.639 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:09:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:09:48.639 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:09:49 np0005555140 nova_compute[187006]: 2025-12-11 10:09:49.961 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:09:50.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:09:51 np0005555140 podman[225208]: 2025-12-11 10:09:51.68370775 +0000 UTC m=+0.054110613 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:09:52 np0005555140 nova_compute[187006]: 2025-12-11 10:09:52.067 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:55 np0005555140 nova_compute[187006]: 2025-12-11 10:09:55.231 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:57 np0005555140 nova_compute[187006]: 2025-12-11 10:09:57.069 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:09:58 np0005555140 podman[225233]: 2025-12-11 10:09:58.69060227 +0000 UTC m=+0.059810596 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 05:09:58 np0005555140 podman[225232]: 2025-12-11 10:09:58.699404302 +0000 UTC m=+0.075214677 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 05:10:00 np0005555140 nova_compute[187006]: 2025-12-11 10:10:00.232 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:02 np0005555140 nova_compute[187006]: 2025-12-11 10:10:02.072 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:02 np0005555140 podman[225269]: 2025-12-11 10:10:02.672379117 +0000 UTC m=+0.052274050 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 11 05:10:02 np0005555140 nova_compute[187006]: 2025-12-11 10:10:02.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:05 np0005555140 nova_compute[187006]: 2025-12-11 10:10:05.235 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:06 np0005555140 nova_compute[187006]: 2025-12-11 10:10:06.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:07 np0005555140 nova_compute[187006]: 2025-12-11 10:10:07.077 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:07 np0005555140 nova_compute[187006]: 2025-12-11 10:10:07.825 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:08 np0005555140 podman[225289]: 2025-12-11 10:10:08.685777515 +0000 UTC m=+0.061649457 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:10:08 np0005555140 nova_compute[187006]: 2025-12-11 10:10:08.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:08 np0005555140 nova_compute[187006]: 2025-12-11 10:10:08.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:08 np0005555140 nova_compute[187006]: 2025-12-11 10:10:08.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:10:09 np0005555140 nova_compute[187006]: 2025-12-11 10:10:09.830 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:10 np0005555140 nova_compute[187006]: 2025-12-11 10:10:10.237 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:10 np0005555140 podman[225314]: 2025-12-11 10:10:10.679201138 +0000 UTC m=+0.049852652 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Dec 11 05:10:10 np0005555140 podman[225313]: 2025-12-11 10:10:10.709768559 +0000 UTC m=+0.085217219 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.829 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.829 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.830 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.848 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.849 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.875 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.875 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.876 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:10:11 np0005555140 nova_compute[187006]: 2025-12-11 10:10:11.876 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.014 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.015 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5710MB free_disk=73.32817077636719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.015 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.015 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.074 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.074 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.078 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.225 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.273 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.274 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:10:12 np0005555140 nova_compute[187006]: 2025-12-11 10:10:12.274 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:10:15 np0005555140 nova_compute[187006]: 2025-12-11 10:10:15.239 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:17 np0005555140 nova_compute[187006]: 2025-12-11 10:10:17.080 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:17 np0005555140 nova_compute[187006]: 2025-12-11 10:10:17.255 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:10:20 np0005555140 nova_compute[187006]: 2025-12-11 10:10:20.242 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:22 np0005555140 nova_compute[187006]: 2025-12-11 10:10:22.082 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:22 np0005555140 podman[225360]: 2025-12-11 10:10:22.662584284 +0000 UTC m=+0.043329346 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 05:10:25 np0005555140 nova_compute[187006]: 2025-12-11 10:10:25.244 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:27 np0005555140 nova_compute[187006]: 2025-12-11 10:10:27.083 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:29 np0005555140 podman[225384]: 2025-12-11 10:10:29.692011137 +0000 UTC m=+0.060465185 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 11 05:10:29 np0005555140 podman[225385]: 2025-12-11 10:10:29.717765901 +0000 UTC m=+0.082448271 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 11 05:10:30 np0005555140 nova_compute[187006]: 2025-12-11 10:10:30.246 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:32 np0005555140 nova_compute[187006]: 2025-12-11 10:10:32.085 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:33 np0005555140 podman[225424]: 2025-12-11 10:10:33.67004812 +0000 UTC m=+0.047123703 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 05:10:35 np0005555140 nova_compute[187006]: 2025-12-11 10:10:35.249 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:37 np0005555140 nova_compute[187006]: 2025-12-11 10:10:37.087 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:39 np0005555140 podman[225443]: 2025-12-11 10:10:39.689903457 +0000 UTC m=+0.070544802 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 05:10:40 np0005555140 nova_compute[187006]: 2025-12-11 10:10:40.250 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:41 np0005555140 podman[225468]: 2025-12-11 10:10:41.671420929 +0000 UTC m=+0.048024570 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350)
Dec 11 05:10:41 np0005555140 podman[225467]: 2025-12-11 10:10:41.693731775 +0000 UTC m=+0.073308301 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Dec 11 05:10:42 np0005555140 nova_compute[187006]: 2025-12-11 10:10:42.089 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:45 np0005555140 nova_compute[187006]: 2025-12-11 10:10:45.252 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:47 np0005555140 nova_compute[187006]: 2025-12-11 10:10:47.090 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:10:48.640 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:10:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:10:48.640 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:10:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:10:48.640 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:10:50 np0005555140 nova_compute[187006]: 2025-12-11 10:10:50.254 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:52 np0005555140 nova_compute[187006]: 2025-12-11 10:10:52.092 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:53 np0005555140 podman[225515]: 2025-12-11 10:10:53.674579251 +0000 UTC m=+0.054984577 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 05:10:55 np0005555140 nova_compute[187006]: 2025-12-11 10:10:55.256 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:10:57 np0005555140 nova_compute[187006]: 2025-12-11 10:10:57.094 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:00 np0005555140 nova_compute[187006]: 2025-12-11 10:11:00.258 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:00 np0005555140 podman[225539]: 2025-12-11 10:11:00.683542348 +0000 UTC m=+0.055533074 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Dec 11 05:11:00 np0005555140 podman[225540]: 2025-12-11 10:11:00.696087616 +0000 UTC m=+0.060284110 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 11 05:11:02 np0005555140 nova_compute[187006]: 2025-12-11 10:11:02.097 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:04 np0005555140 podman[225577]: 2025-12-11 10:11:04.672905744 +0000 UTC m=+0.050102269 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 05:11:04 np0005555140 nova_compute[187006]: 2025-12-11 10:11:04.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:05 np0005555140 nova_compute[187006]: 2025-12-11 10:11:05.261 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:06 np0005555140 nova_compute[187006]: 2025-12-11 10:11:06.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:07 np0005555140 nova_compute[187006]: 2025-12-11 10:11:07.098 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:08 np0005555140 nova_compute[187006]: 2025-12-11 10:11:08.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:09 np0005555140 nova_compute[187006]: 2025-12-11 10:11:09.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:09 np0005555140 nova_compute[187006]: 2025-12-11 10:11:09.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:10 np0005555140 nova_compute[187006]: 2025-12-11 10:11:10.263 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:10 np0005555140 podman[225597]: 2025-12-11 10:11:10.683801795 +0000 UTC m=+0.056526092 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 05:11:10 np0005555140 nova_compute[187006]: 2025-12-11 10:11:10.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:10 np0005555140 nova_compute[187006]: 2025-12-11 10:11:10.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.098 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:12 np0005555140 podman[225625]: 2025-12-11 10:11:12.684059063 +0000 UTC m=+0.057850661 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Dec 11 05:11:12 np0005555140 podman[225624]: 2025-12-11 10:11:12.706009518 +0000 UTC m=+0.083499661 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.828 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.828 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.842 187010 DEBUG nova.compute.manager [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.843 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.872 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.872 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.872 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.872 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.998 187010 WARNING nova.virt.libvirt.driver [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 05:11:12 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.999 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.32818603515625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.999 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:12.999 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:13.055 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:13.056 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:13.089 187010 DEBUG nova.compute.provider_tree [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed in ProviderTree for provider: da0ef57a-f24e-4679-bba9-2f0d52d82a56 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:13.103 187010 DEBUG nova.scheduler.client.report [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Inventory has not changed for provider da0ef57a-f24e-4679-bba9-2f0d52d82a56 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:13.104 187010 DEBUG nova.compute.resource_tracker [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 05:11:13 np0005555140 nova_compute[187006]: 2025-12-11 10:11:13.104 187010 DEBUG oslo_concurrency.lockutils [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:11:15 np0005555140 nova_compute[187006]: 2025-12-11 10:11:15.264 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:17 np0005555140 nova_compute[187006]: 2025-12-11 10:11:17.100 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:18 np0005555140 nova_compute[187006]: 2025-12-11 10:11:18.090 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:20 np0005555140 nova_compute[187006]: 2025-12-11 10:11:20.267 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:20 np0005555140 nova_compute[187006]: 2025-12-11 10:11:20.824 187010 DEBUG oslo_service.periodic_task [None req-6a9f4a80-bfbf-4246-ae8c-7e7bbe681831 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 05:11:22 np0005555140 nova_compute[187006]: 2025-12-11 10:11:22.103 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:24 np0005555140 podman[225672]: 2025-12-11 10:11:24.686822429 +0000 UTC m=+0.060095773 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 05:11:25 np0005555140 nova_compute[187006]: 2025-12-11 10:11:25.268 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:27 np0005555140 nova_compute[187006]: 2025-12-11 10:11:27.103 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:30 np0005555140 nova_compute[187006]: 2025-12-11 10:11:30.271 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:31 np0005555140 podman[225696]: 2025-12-11 10:11:31.681899753 +0000 UTC m=+0.058592092 container health_status 8334f91b31e7e6bd6b651a64ad808ee9194b3a6ad7ff2bee459eda93ad650248 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 05:11:31 np0005555140 podman[225697]: 2025-12-11 10:11:31.68391989 +0000 UTC m=+0.057259353 container health_status cb475d42660dadcaa9325110b904751b38281d0ee75e365f0f46a7014a216e29 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 05:11:32 np0005555140 nova_compute[187006]: 2025-12-11 10:11:32.105 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:35 np0005555140 nova_compute[187006]: 2025-12-11 10:11:35.273 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:35 np0005555140 podman[225736]: 2025-12-11 10:11:35.701917454 +0000 UTC m=+0.068876784 container health_status 9bd577a015e4f1d16d8de68c0bec359a3190f014cfa5a06896cd44c688fe8483 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 05:11:37 np0005555140 nova_compute[187006]: 2025-12-11 10:11:37.107 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:40 np0005555140 nova_compute[187006]: 2025-12-11 10:11:40.275 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:41 np0005555140 podman[225756]: 2025-12-11 10:11:41.673023279 +0000 UTC m=+0.045272779 container health_status 3779fefd362964b84f2ed202357ecc53d537f6f90c82d847f8159ad2e160a17a (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 05:11:42 np0005555140 nova_compute[187006]: 2025-12-11 10:11:42.108 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:43 np0005555140 podman[225781]: 2025-12-11 10:11:43.686799212 +0000 UTC m=+0.053790944 container health_status 9417d0bb22430c47c58ac7abfe7ea77527c3029753f580057383380f2841bd5d (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Dec 11 05:11:43 np0005555140 podman[225780]: 2025-12-11 10:11:43.691562938 +0000 UTC m=+0.067084483 container health_status 7da866eee003a4f3bbc57cc380676601e9870a91d9de18649168a62fc1a46659 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 05:11:45 np0005555140 nova_compute[187006]: 2025-12-11 10:11:45.275 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:47 np0005555140 nova_compute[187006]: 2025-12-11 10:11:47.110 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:11:48.641 104288 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 05:11:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:11:48.642 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 05:11:48 np0005555140 ovn_metadata_agent[104282]: 2025-12-11 10:11:48.642 104288 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 ceilometer_agent_compute[197744]: 2025-12-11 10:11:50.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 05:11:50 np0005555140 nova_compute[187006]: 2025-12-11 10:11:50.277 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:50 np0005555140 systemd-logind[787]: New session 29 of user zuul.
Dec 11 05:11:50 np0005555140 systemd[1]: Started Session 29 of User zuul.
Dec 11 05:11:52 np0005555140 nova_compute[187006]: 2025-12-11 10:11:52.111 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:54 np0005555140 ovs-vsctl[226004]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 11 05:11:55 np0005555140 nova_compute[187006]: 2025-12-11 10:11:55.278 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:55 np0005555140 podman[226079]: 2025-12-11 10:11:55.694959325 +0000 UTC m=+0.056268235 container health_status c8e5c8ff09c806ccbdd4a36f7c0172466b3ea0de18ad7134b9eda51b56f82f0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 05:11:55 np0005555140 virtqemud[186728]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 11 05:11:55 np0005555140 virtqemud[186728]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 11 05:11:55 np0005555140 virtqemud[186728]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 11 05:11:57 np0005555140 nova_compute[187006]: 2025-12-11 10:11:57.112 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 05:11:58 np0005555140 systemd[1]: Starting Hostname Service...
Dec 11 05:11:58 np0005555140 systemd[1]: Started Hostname Service.
Dec 11 05:12:00 np0005555140 nova_compute[187006]: 2025-12-11 10:12:00.280 187010 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
